How to find the orthogonal projection of $u$ onto $K$


I would love some help with a question I don't know how to answer.

Let $u=\begin{bmatrix} 1 & 0 & 0 \end{bmatrix}^T$, $V= \begin{bmatrix} 1 & i \\ -i & 1 \\ 1 & 0 \end{bmatrix}$, and $K=\{Vx : x\in \mathbb{C}^2\}$.

Find the orthogonal projection of $u$ onto $K$. (Note: $V(V^HV)^{-1}V^H$ is not a real matrix.)

Please explain the steps so I can understand them, since it's supposed to be easy. Thanks.

There are 2 solutions below.

BEST ANSWER

You’re working with complex vector spaces, so I don’t see why there would be a problem with the projection matrix having complex entries. Carrying through the computation, you should get $$V(V^HV)^{-1}V^H = \begin{bmatrix}\frac12&\frac i2&0\\-\frac i2&\frac12&0\\0&0&1\end{bmatrix}$$ and the result of applying this matrix to $u$ is just its first column.
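A quick numerical sanity check of this matrix (a NumPy sketch; the variable names are mine):

```python
import numpy as np

# Columns of V span the subspace K.
V = np.array([[1, 1j],
              [-1j, 1],
              [1, 0]], dtype=complex)
u = np.array([1, 0, 0], dtype=complex)

# Projection matrix onto the column space of V: V (V^H V)^{-1} V^H.
VH = V.conj().T
P = V @ np.linalg.inv(VH @ V) @ VH

expected = np.array([[0.5, 0.5j, 0],
                     [-0.5j, 0.5, 0],
                     [0, 0, 1]])
assert np.allclose(P, expected)

# Applying P to u picks out the first column of P: (1/2, -i/2, 0)^T.
proj = P @ u
assert np.allclose(proj, P[:, 0])
```

As expected of an orthogonal projection matrix, $P$ is Hermitian and idempotent ($P^H = P$, $P^2 = P$).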

This is a rather tedious computation, and you can save yourself some work by computing the projection directly in one of several ways. $K$ is spanned by the columns of $V$, which are obviously linearly independent (if they weren't, $V^HV$ wouldn't be invertible).

If you don't happen to notice that $K$ is also spanned by $(i,1,0)^T$ and $(0,0,1)^T$ (indeed, adding $i$ times the second column of $V$ to the first gives $(0,0,1)^T$), you can produce an orthogonal basis of $K$ by applying one iteration of the Gram-Schmidt process and use that basis of $K$ to compute the projection directly via the usual projection formula. However, this other basis is more convenient: $u=(1,0,0)^T$ and $(0,0,1)^T$ are obviously orthogonal, so the orthogonal projection of $u$ onto $K$ is simply its orthogonal projection onto $(i,1,0)^T$.

Alternatively, you might notice that $(-i,1,0)^T$ is orthogonal to both columns of $V$, so orthogonal projection onto $K$ is equal to orthogonal rejection from this vector, i.e., compute the orthogonal projection of $u$ onto $(-i,1,0)^T$ and subtract that from $u$.
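Both shortcuts can be checked numerically; they agree with each other and with the matrix computation above (a NumPy sketch; `proj_onto` is a helper name of my choosing):

```python
import numpy as np

u = np.array([1, 0, 0], dtype=complex)

def proj_onto(u, w):
    """Orthogonal projection of u onto the line spanned by w,
    using the Hermitian inner product (np.vdot conjugates its first argument)."""
    return (np.vdot(w, u) / np.vdot(w, w)) * w

# Route 1: u is orthogonal to (0,0,1)^T, so only the projection
# onto (i,1,0)^T contributes.
w = np.array([1j, 1, 0])
p1 = proj_onto(u, w)

# Route 2: (-i,1,0)^T is orthogonal to both columns of V, so projection
# onto K is rejection from this vector.
n = np.array([-1j, 1, 0])
p2 = u - proj_onto(u, n)

assert np.allclose(p1, p2)
assert np.allclose(p1, np.array([0.5, -0.5j, 0]))
```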


Notice that $K \subseteq \mathbb{C}^{3}$, since a general element of $K$ has the form: $ Vx = \begin{bmatrix} 1&i\\ -i & 1\\ 1 & 0 \end{bmatrix} \begin{bmatrix} x_{1}\\ x_{2} \end{bmatrix} = \begin{bmatrix} x_{1} + i x_{2}\\ x_{2} - i x_{1}\\ x_{1} \end{bmatrix} $.

For a single vector $w = Vx \in K$, the orthogonal projection of $u$ onto $w$ with the Hermitian inner product $\langle a,b\rangle = b^{H}a$ is $$\mathrm{proj}_{w}(u) = \dfrac{\langle u, w\rangle}{\langle w, w\rangle}\, w = \dfrac{\overline{x_{1} + i x_{2}}}{2|x_{1}+ix_{2}|^{2} + |x_{1}|^{2}}\, w,$$ where the denominator uses $|x_{2}-ix_{1}| = |x_{1}+ix_{2}|$. To project onto the whole subspace $K$ rather than one vector, sum such one-dimensional projections over an orthogonal basis of $K$.
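Summing one-dimensional projections over an orthogonal basis of $K$ can be sketched as follows (a NumPy check, using the orthogonal basis $(i,1,0)^T$, $(0,0,1)^T$ noted in the other answer):

```python
import numpy as np

u = np.array([1, 0, 0], dtype=complex)

# Orthogonal basis of K: these two vectors have zero Hermitian inner product.
basis = [np.array([1j, 1, 0]), np.array([0, 0, 1], dtype=complex)]

# Projection onto K = sum of projections onto each orthogonal basis vector.
# np.vdot conjugates its first argument, as the Hermitian inner product requires.
proj = sum((np.vdot(w, u) / np.vdot(w, w)) * w for w in basis)
assert np.allclose(proj, np.array([0.5, -0.5j, 0]))
```

The second basis vector contributes nothing here, since $u$ is orthogonal to $(0,0,1)^T$.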