Use of tensor product in calculation of vector projection


In a textbook I stumbled upon this statement:

The projection $(\mathbf v \cdot \mathbf r)\mathbf r$ can be replaced by the tensor product $(\mathbf r ⊗ \mathbf r)\mathbf v$

where $\mathbf r$ is a unit vector and $\mathbf v$ is an arbitrary vector.

The authors didn't explain why this is true, presumably assuming it should be obvious to the reader.

Anyway, I know that the tensor (outer) product of a vector with itself should produce a matrix like this:

$$ \left[ \begin{array}{ccc} r_x^2&r_xr_y&r_xr_z\\ r_yr_x&r_y^2&r_yr_z\\ r_zr_x&r_zr_y&r_z^2 \end{array} \right] $$

So, multiplying this matrix by $\mathbf v$ should give a result like this:

$$ \left[ \begin{array}{ccc} r_x^2&r_xr_y&r_xr_z\\ r_yr_x&r_y^2&r_yr_z\\ r_zr_x&r_zr_y&r_z^2 \end{array} \right] \left[ \begin{array}{c} v_x\\ v_y\\ v_z \end{array} \right] = \left[ \begin{array}{ccc} r_x^2v_x+r_xr_yv_y+r_xr_zv_z\\ r_yr_xv_x+r_y^2v_y+r_yr_zv_z\\ r_zr_xv_x+r_zr_yv_y+r_z^2v_z \end{array} \right] $$

This is as far as I can go. I'm very inexperienced with advanced math like tensors (basically I only know how to calculate them). How is this supposed to give us the projection of $\mathbf v$ onto $\mathbf r$? Can somebody explain this without too much advanced math, or is that not possible?
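As a quick numerical sanity check (my own sketch, not from the textbook), one can verify on a concrete unit vector $\mathbf r$ that the matrix $(\mathbf r ⊗ \mathbf r)$ applied to $\mathbf v$ gives exactly $(\mathbf v \cdot \mathbf r)\mathbf r$; the vectors `r` and `v` below are arbitrary example values:

```python
def dot(a, b):
    # Standard dot product of two sequences.
    return sum(x * y for x, y in zip(a, b))

def outer(a, b):
    # Outer (tensor) product: a 3x3 matrix with entries a_i * b_j.
    return [[x * y for y in b] for x in a]

def matvec(m, v):
    # Matrix-vector product, row by row.
    return [dot(row, v) for row in m]

def scale(c, v):
    return [c * x for x in v]

# r must be a unit vector; (0.6, 0.8, 0) has length exactly 1.
r = (0.6, 0.8, 0.0)
v = (2.0, -1.0, 3.0)

proj_dot = scale(dot(v, r), r)      # (v . r) r
proj_mat = matvec(outer(r, r), v)   # (r ⊗ r) v

# The two formulas agree componentwise.
assert all(abs(a - b) < 1e-12 for a, b in zip(proj_dot, proj_mat))
```

Expanding the matrix product componentwise shows why: each component of $(\mathbf r ⊗ \mathbf r)\mathbf v$ factors as $r_i(r_xv_x + r_yv_y + r_zv_z) = r_i(\mathbf r \cdot \mathbf v)$.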

BEST ANSWER

We assume that $\mathbf{r}$ is a unit vector, or equivalently, viewing $\mathbf{r}$ as a column vector, that $\mathbf{r}^T\mathbf{r} = 1$, where $\mathbf{r}^T$ is the transpose of $\mathbf{r}$. The tensor product in question is the outer product $\mathbf{r} \mathbf{r}^T$, and we need to show that this matrix projects any vector $\mathbf{v}$ onto the subspace generated by $\mathbf{r}$. In fact this is an orthogonal projection, so let's prove that.

First, note that $(\mathbf{r} \mathbf{r}^T) \mathbf{v}=\mathbf{r}(\mathbf{r}^T \mathbf{v})$ because matrix multiplication is associative. Since $\mathbf{r}^T\mathbf{v}$ is the dot product of $\mathbf{r}$ and $\mathbf{v}$, it is a scalar, so $\mathbf{r} (\mathbf{r}^T \mathbf{v})$ is a scalar multiple of $\mathbf{r}$ and therefore lies in the subspace generated by $\mathbf{r}$.

Since the author uses a unit vector, I assume they mean the orthogonal projection, so all that's left is to check that $\mathbf{r} (\mathbf{r}^T \mathbf{v})$ is orthogonal to the residual $\mathbf{v}-\mathbf{r} (\mathbf{r}^T \mathbf{v})$. For this we need one more fact: $(\mathbf{r}\mathbf{r}^T)^T=(\mathbf{r}^T)^T\mathbf{r}^T=\mathbf{r}\mathbf{r}^T$, i.e. $\mathbf{r}\mathbf{r}^T$ is symmetric.

Now we compute the dot product of these two vectors:
$$(\mathbf{r} \mathbf{r}^T \mathbf{v})^T(\mathbf{v}-\mathbf{r} \mathbf{r}^T \mathbf{v})=(\mathbf{v}^T\mathbf{r}\mathbf{r}^T)(\mathbf{v}-\mathbf{r} \mathbf{r}^T \mathbf{v})=\mathbf{v}^T\mathbf{r}\mathbf{r}^T\mathbf{v}-\mathbf{v}^T\mathbf{r}\mathbf{r}^T\mathbf{r}\mathbf{r}^T\mathbf{v}.$$
Using $\mathbf{r}^T\mathbf{r}=1$, the second term reduces to $\mathbf{v}^T\mathbf{r}\mathbf{r}^T\mathbf{v}$, so the whole expression becomes
$$\mathbf{v}^T\mathbf{r}\mathbf{r}^T\mathbf{v} - \mathbf{v}^T\mathbf{r}\mathbf{r}^T\mathbf{v}=0.$$
The two vectors are indeed orthogonal, which shows that $\mathbf{r}\mathbf{r}^T\mathbf{v}$ is the orthogonal projection of $\mathbf{v}$ onto the subspace generated by $\mathbf{r}$.
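The three properties used in the argument — $P = \mathbf{r}\mathbf{r}^T$ is symmetric, $P\mathbf{v}$ lies along $\mathbf{r}$, and the residual $\mathbf{v} - P\mathbf{v}$ is orthogonal to $P\mathbf{v}$ — can be checked numerically. This is my own sketch with arbitrary example vectors, not the answerer's code:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

r = (0.6, 0.8, 0.0)          # unit vector: r^T r = 1
v = (2.0, -1.0, 3.0)         # arbitrary test vector

# P = r r^T, the outer product as a 3x3 matrix.
P = [[ri * rj for rj in r] for ri in r]

# Symmetry: P[i][j] == P[j][i].
assert all(P[i][j] == P[j][i] for i in range(3) for j in range(3))

Pv = [dot(row, v) for row in P]                 # P v = (r^T v) r
residual = [vi - pi for vi, pi in zip(v, Pv)]   # v - P v

# The residual is orthogonal to the projection.
assert abs(dot(Pv, residual)) < 1e-12

# Idempotence: applying P twice changes nothing (uses r^T r = 1),
# as expected of any projection matrix.
PPv = [dot(row, Pv) for row in P]
assert all(abs(a - b) < 1e-12 for a, b in zip(Pv, PPv))
```

Idempotence ($P^2 = P$) isn't needed for the proof above, but it is the defining property of a projection and follows from the same identity $\mathbf{r}^T\mathbf{r} = 1$.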