Orthogonal Projection in Hilbert Spaces Proofs


It is given that $V$ is an $N$-dimensional inner product space (with the standard inner product), and $S_2$ is a subspace of $V$. $Q = \{p_1, p_2, \dots, p_M\}$, $M < N$, is a basis for $S_2$. The problem is to find a solution to $\underset{\hat{y}}{\operatorname{argmin}} \lvert\lvert y-\hat{y} \rvert \rvert$, where $\lvert\lvert \cdot \rvert \rvert$ is the norm induced by the inner product, $y \in V$ and $\hat{y} \in S_2$.

Prove that the solution to the above problem must satisfy:

i- $(y-\hat{y}) \perp S_2$

ii- $(y-\hat{y}) \perp \hat{y}$

iii- $(y-\hat{y}) \perp p_j$, $j = 1, \dots, M$
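As a concrete sanity check (my own illustration, not part of the exercise), here is a small NumPy sketch that computes the minimizer via the normal equations and verifies condition iii numerically; the matrix `P` collects a randomly chosen basis of $S_2$ as columns:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 6, 3
P = rng.standard_normal((N, M))   # columns p_1, ..., p_M: a basis of S_2
y = rng.standard_normal(N)

# Solve the normal equations P^T P c = P^T y for the coefficients of y_hat
c = np.linalg.solve(P.T @ P, P.T @ y)
y_hat = P @ c

# Condition iii: the residual y - y_hat is orthogonal to every basis vector p_j
print(np.round(P.T @ (y - y_hat), 10))
```

The printed vector is numerically zero, which is exactly statement iii written in coordinates.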

$\textbf{Attempt:}$

Now, I was able to prove i- and ii- using the Pythagorean theorem and the triangle inequality:

$$\lvert\lvert y-\hat{y_0} \rvert \rvert^2 \leq \underbrace{\lvert\lvert y-\hat{y_0} \rvert \rvert^2}_{a} + \underbrace{\lvert\lvert \hat{y_0} - \hat{y}\rvert \rvert^2}_{b}$$

where $a$ and $b$ are the squared norms of the perpendicular vectors $y-\hat{y_0}$ and $\hat{y_0}-\hat{y}$.

Using Pythagoras Theorem we have:

$$\lvert\lvert y-\hat{y_0} \rvert \rvert^2 \leq \lvert\lvert y \underbrace{- \hat{y_0} + \hat{y_0}}_{0} - \hat{y} \rvert \rvert^2$$

This gives us: $$\lvert\lvert y-\hat{y_0} \rvert \rvert^2 \leq \lvert\lvert y- \hat{y} \rvert \rvert^2$$

where equality holds for $\hat{y} = \hat{y_0} = \sum_{i=1}^{M}c_ip_i$, so $\hat{y_0}$ attains the minimum.

Therefore we have $(y-\hat{y}) \perp \hat{y} \in S_2$, which means $(y-\hat{y}) \perp S_2$. Recall that for a vector to be orthogonal to a subspace, its inner product with each and every vector in that subspace must be zero.
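The Pythagorean identity underlying this argument can also be checked numerically. The following sketch (my own illustration, with a randomly generated basis) computes the minimizer $\hat{y_0}$ by least squares and confirms $\lvert\lvert y-\hat{y} \rvert\rvert^2 = \lvert\lvert y-\hat{y_0} \rvert\rvert^2 + \lvert\lvert \hat{y_0}-\hat{y} \rvert\rvert^2$ for an arbitrary $\hat{y} \in S_2$:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 5, 2
P = rng.standard_normal((N, M))          # columns: a basis of S_2
y = rng.standard_normal(N)

# Minimizer y0: orthogonal projection of y onto S_2, via least squares
c, *_ = np.linalg.lstsq(P, y, rcond=None)
y0 = P @ c
y_arb = P @ rng.standard_normal(M)       # an arbitrary element of S_2

lhs = np.linalg.norm(y - y_arb) ** 2
rhs = np.linalg.norm(y - y0) ** 2 + np.linalg.norm(y0 - y_arb) ** 2
# The two sides agree because (y - y0) is orthogonal to (y0 - y_arb)
print(np.isclose(lhs, rhs))
```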

Now, how should I approach iii-, and is my proof of i- and ii- correct?

$\textbf{Best Answer:}$

OK, it's easier to answer directly than to keep on with the comments, so here it is. First extend $Q$ to a basis $\{p_i\}_{i=1}^N$ of $V$; then, given $y$, you can write it as $y = \sum_{i=1}^N c_i p_i$ for some coordinates $c_i$ of $y$ with respect to this basis. You want to find a vector $\hat{y} \in Span\{p_1, \dots, p_M\}$ such that $||y-\hat{y}||^2$ is minimal. Now forget about the $\{p_i\}$ and take another basis of $S_2$ composed of orthonormal vectors, and after that complete it to an orthonormal basis of the whole space. If you wish, you can do this starting from $\{p_i\}$ by the so-called Gram-Schmidt procedure. What is important is that you now have a basis $\{q_j\}$ whose vectors are normal to each other and of unit norm, and moreover $S_2 = Span\{q_1, \dots, q_M\}$. At this point, rewrite $y$ in these new coordinates to get $y = \sum_{j=1}^N d_j q_j$.
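For concreteness, a minimal Gram-Schmidt sketch in Python (my own illustration, not part of the answer):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # Subtract the components of v along the vectors already in the basis
        w = v - sum(np.dot(q, v) * q for q in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

vecs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
q1, q2 = gram_schmidt(vecs)
print(np.dot(q1, q2))  # numerically zero: the q's are orthogonal unit vectors
```

The resulting $q_1, q_2$ span the same subspace as the input vectors, which is exactly the property used in the answer.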

The reason to choose an orthonormal basis is that with such a choice you have that

$$ ||y||^2 = \sum_{j=1} ^N d_j ^2 . $$

Of course, since subtracting an element of $S_2$ from $y$ can only modify the first $M$ components, the minimum is attained for $\hat{y} = \sum_{i=1}^M d_i q_i$; indeed $||y - \hat{y}||^2 = \sum_{j=M+1}^N d_j^2$, and no choice of $\hat{y} \in S_2$ can reduce this further. At this point it's easy to conclude: $y - \hat{y}$ lives in $Span\{q_{M+1}, \dots, q_N\}$, which is the orthogonal complement of $S_2$ by construction, so in particular $(y-\hat{y}) \perp p_j$ for every $j$.
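A short numerical sketch of this last step (assuming a randomly generated orthonormal basis, obtained here via a QR factorization): truncating the coordinates of $y$ to the first $M$ entries leaves an error equal to the energy in the discarded coordinates:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 6, 2
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))  # orthonormal basis q_1, ..., q_N
y = rng.standard_normal(N)

d = Q.T @ y                  # coordinates d_j of y in the q-basis
y_hat = Q[:, :M] @ d[:M]     # keep only the first M coordinates

# ||y - y_hat||^2 equals the energy in the discarded coordinates d_{M+1}, ..., d_N
err = np.linalg.norm(y - y_hat) ** 2
print(np.isclose(err, np.sum(d[M:] ** 2)))
```

Since any other $\hat{y} \in S_2$ can only change the first $M$ coordinates of the residual, this truncation is indeed the minimizer.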