If $v \in V$, prove that there exists an element $u \in W$ such that $v - u$ is orthogonal to $W$.


Let $V$ be an $\mathbb{R}$-vector space with inner product $\langle\cdot,\cdot\rangle$, let $w_1,w_2,\ldots,w_s \in V$ be orthonormal vectors, and let $W$ be the subspace of $V$ generated by $\{w_1,w_2,\ldots,w_s\}$.

If $v \in V$, prove that there exists an element $u \in W$ such that $v - u$ is orthogonal to $W$.

I think this problem is simple, but I cannot write a correct proof. The idea I have is that I need to prove that there exists an element $u \in W$ such that $v - u \in W^\bot$, where $W^\bot$ is the orthogonal complement of $W$, i.e., $W^{\bot} = \{ x \in V \mid \langle w,x\rangle = 0\ \forall w \in W \}$.

Can someone help me understand this correctly, please?


There are 3 best solutions below

On BEST ANSWER

Extend $w_1,\dotsc,w_s$ to an orthonormal basis $w_1,\dotsc,w_s,w_{s+1},\dotsc,w_n$ of $V$ (this is possible by Gram–Schmidt). For $v\in V$ write $v=\sum\lambda_i\cdot w_i$ and let $$ u=\lambda_1\cdot w_1+\dotsb+\lambda_s\cdot w_s. $$ Then for $1\leq k\leq s$ we have \begin{align*} \langle v-u,w_k\rangle &= \left\langle \sum_{i=1}^n \lambda_i\cdot w_i-\sum_{i=1}^s\lambda_i\cdot w_i , w_k \right\rangle \\ &= \left\langle\sum_{i=s+1}^n\lambda_i\cdot w_i,w_k\right\rangle \\ &= \sum_{i=s+1}^n\lambda_i\cdot\left\langle w_i,w_k\right\rangle \\ &= 0. \end{align*} Do you see why this implies $\langle v-u,w\rangle=0$ for every $w\in W$?
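As a numerical sanity check of the argument above (an illustrative sketch with made-up vectors, not part of the original answer), one can take orthonormal vectors in $\mathbb{R}^3$ with $s = 2$, build $u$ from the coordinates of $v$, and verify that $v - u$ is orthogonal to each $w_i$:

```python
# Sketch: project v onto span{w_1, w_2} for orthonormal w_i in R^3,
# then check that the residual v - u is orthogonal to each w_i.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

w = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]  # orthonormal w_1, w_2
v = (3.0, 4.0, 5.0)

# u = sum_i <v, w_i> w_i  (the lambda_i are the coordinates of v)
coeffs = [dot(v, wi) for wi in w]
u = tuple(sum(c * wi[j] for c, wi in zip(coeffs, w)) for j in range(3))

residual = tuple(vj - uj for vj, uj in zip(v, u))
print(u)                                # (3.0, 4.0, 0.0)
print([dot(residual, wi) for wi in w])  # [0.0, 0.0]
```

Since the $w_i$ span $W$, vanishing inner products against each $w_i$ give orthogonality to all of $W$ by linearity.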


This is asking about the existence of an orthogonal projection and best approximation in an inner product space. In a finite-dimensional (or closed) subspace there is always an element $u$ minimizing the norm $\|v-u\|$. Then $x=v-u$ is indeed orthogonal to every element $w\in W$: otherwise we could improve upon $u$ by choosing our new "best approximation" to be $u+w\frac{\langle x,w\rangle}{\|w\|^2}$, since $$\left\|x-w\frac{\langle x,w\rangle}{\|w\|^2}\right\|^2=\|x\|^2-\frac{\langle x,w\rangle^2}{\|w\|^2}<\|x\|^2.$$
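A quick numerical check of the identity used above (illustrative, with made-up vectors): subtracting the projection of $x$ onto $w$ strictly decreases the squared norm whenever $\langle x,w\rangle \neq 0$.

```python
# Verify: ||x - w<x,w>/||w||^2||^2 == ||x||^2 - <x,w>^2/||w||^2 < ||x||^2.

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

x = (1.0, 2.0, 2.0)
w = (0.0, 1.0, 0.0)

c = dot(x, w) / dot(w, w)          # <x,w> / ||w||^2
y = tuple(xi - c * wi for xi, wi in zip(x, w))

lhs = dot(y, y)                    # left-hand side of the identity
rhs = dot(x, x) - dot(x, w) ** 2 / dot(w, w)
print(lhs, rhs, dot(x, x))         # lhs == rhs, strictly less than ||x||^2
```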

More on best approximation and norm convexity: every Hilbert space is uniformly convex, and uniform convexity yields both existence and uniqueness of the best approximation. Every uniformly convex space is strictly convex, and strict convexity alone is enough for uniqueness.


HINT:

$$u = \sum_{i=1}^s \langle v, w_i \rangle \cdot w_i $$