Finding this vector given that the vector is a projection onto a subspace


Sorry for the weird/confusing notation, but it's what the course I'm doing right now uses.

The question is:
Recall that the projection of $y$ onto a vector subspace $V$ of $\Omega$ is a vector $\hat{y}\in V$ such that $(y-\hat{y})$ is orthogonal to all vectors in $V$. In particular, if the vectors $v_1,\ldots,v_k$ span the whole vector space $V$, then
$$(y-\hat{y})^T v_i = 0$$ for all $i$.
Let $y = (7,0,2)^T$ and $v_1 = (1,1,1)^T,v_2 = (2,-1,-1)^T$. Find $\hat{y}$.

So, using the fact that the dot product is zero for all $i$, I got the following equations
(writing $\hat{y} = (\hat{y}_1,\hat{y}_2,\hat{y}_3)^T$):

$$\hat{y}_1 + \hat{y}_2 + \hat{y}_3 = 9 \\ 2\hat{y}_1 - \hat{y}_2 - \hat{y}_3 = 12$$

I can row-reduce this, but it seems there are infinitely many solutions... did I misunderstand the problem?

3 Answers

BEST ANSWER

You're missing the fact that $\hat y$ needs to be an element of the subspace; this additional piece of information needs to be accounted for somehow.

One way to approach this is to instead write $\hat y$ as $$ \hat y = x_1 v_1 + x_2 v_2 $$ for scalars $x_1,x_2$. From there, setting those same dot-products equal to zero gives you two equations that can be solved for $x_1$ and $x_2$.
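A quick numerical check of this approach (a sketch using NumPy; the variable names are my own):

```python
import numpy as np

# Data from the question
y = np.array([7.0, 0.0, 2.0])
v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([2.0, -1.0, -1.0])

# Write y_hat = x1*v1 + x2*v2 and require (y - y_hat) . v_i = 0.
# Expanding gives a 2x2 linear system in x1, x2:
#   (v1.v1) x1 + (v1.v2) x2 = v1.y
#   (v2.v1) x1 + (v2.v2) x2 = v2.y
M = np.array([[v1 @ v1, v1 @ v2],
              [v2 @ v1, v2 @ v2]])
b = np.array([v1 @ y, v2 @ y])
x1, x2 = np.linalg.solve(M, b)

y_hat = x1 * v1 + x2 * v2
print(y_hat)  # [7. 1. 1.]
```

The residual $y-\hat y$ is then orthogonal to both $v_1$ and $v_2$, which is exactly the defining condition.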

ANSWER

We can represent the subspace spanned by $v_1$ and $v_2$ as the column space of the matrix,

$$ A = \begin{bmatrix}1&2\\1&-1\\1&-1\end{bmatrix} $$

Now, the projection will lie in the column space of $A$. Let the projection $\hat{y} = Ax$ for some $x$. Note that $Ax$ is a linear combination of the columns of $A$.

Now, $y-\hat{y}$ should be orthogonal to all the vectors in $C(A)$. This means that $y-\hat{y}$ must lie in the orthogonal complement of $C(A)$, which is precisely $N(A^T)$, the nullspace of $A^T$.

Therefore, $y-\hat{y}$ satisfies,

$$A^T (y-\hat{y}) = 0 \implies A^TAx = A^Ty$$

Solve for $x$ and then compute $\hat{y} = Ax$.
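The normal-equation route can be sketched numerically as well (NumPy; names are mine):

```python
import numpy as np

# Columns of A are v1 = (1,1,1) and v2 = (2,-1,-1)
A = np.array([[1.0,  2.0],
              [1.0, -1.0],
              [1.0, -1.0]])
y = np.array([7.0, 0.0, 2.0])

# Normal equations: A^T A x = A^T y
x = np.linalg.solve(A.T @ A, A.T @ y)

# The projection is the corresponding combination of the columns of A
y_hat = A @ x
print(y_hat)  # [7. 1. 1.]
```

Since $v_1$ and $v_2$ here happen to be orthogonal, $A^TA$ is diagonal and the system decouples, but the same code works for any independent columns.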

ANSWER

The projection formula for projecting a vector $y$ onto a vector $v$ is given by $$\operatorname{proj}_{v}(y) = \left(\frac{v\cdot y}{v\cdot v}\right)v$$ Moreover, for a vector space $V$ with an orthogonal basis $\{ v_{1},\dotsc,v_{n}\}$, to find the projection of $y$ onto $V$ you can just find the projection of $y$ onto each $v_{i}$ and take the sum (1).
In your case, the basis vectors are orthogonal (check: $v_1\cdot v_2 = 2-1-1 = 0$), so you can find the projection onto each vector and take the sum. So we have $$ \operatorname{proj}_{v_{1}}(y) = \left(\frac{v_{1}\cdot y}{v_{1}\cdot v_{1}}\right)v_{1}=\frac{9}{3}v_{1}=3v_{1} $$ and $$ \operatorname{proj}_{v_{2}}(y) = \left(\frac{v_{2}\cdot y}{v_{2}\cdot v_{2}}\right)v_{2}=\frac{12}{6}v_{2}=2v_{2} $$ Now, if you take $\hat{y} = 3v_{1}+2v_{2} = (7,1,1)^T$, you'll find that $$ (y-\hat{y})\cdot v_{i} = 0 $$ for $i=1,2$. Personally, I find the Wikipedia page on the Gram-Schmidt process quite helpful for this sort of thing.
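The sum-of-projections computation above can be verified in a few lines (a NumPy sketch; the helper name `proj` is my own):

```python
import numpy as np

y = np.array([7.0, 0.0, 2.0])
v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([2.0, -1.0, -1.0])

def proj(v, y):
    """Projection of y onto the line spanned by v."""
    return (v @ y) / (v @ v) * v

# v1 and v2 are orthogonal (v1 @ v2 == 0), so the projection onto
# span{v1, v2} is just the sum of the two one-dimensional projections.
y_hat = proj(v1, y) + proj(v2, y)
print(y_hat)  # [7. 1. 1.]
```

Note that this decomposition is only valid because the basis is orthogonal; for a non-orthogonal spanning set you would need the normal equations instead.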
Also, the result I've cited as (1) can be found in most linear algebra texts, in the chapter on orthogonal projections.