Writing a vector as the sum of orthogonal vectors


At the start of a proof of the Cauchy-Schwarz inequality, my lecturer wrote down the following statement:

Let $V$ be an inner product space over a field $\mathbb{F}$. Then $$ \forall\, x, y \in V,\ \exists\, w \in V,\ \lambda \in \mathbb{F} \ \text{ such that} $$

$$ x = \lambda y + w \quad \text{and} \quad \langle w,y \rangle = 0. $$

Is this an obvious statement (I can't see it myself), and why is it the case?

There are 2 answers below.

Accepted answer:

Let us think about it. You want $w = x - \lambda y$ to be orthogonal to $y$. That would be $$ 0=\langle x-\lambda y,y\rangle=\langle x,y\rangle-\lambda\langle y,y\rangle. $$ So, assuming $y \neq 0$, $\lambda =\frac{\langle x,y\rangle}{\langle y,y\rangle}$ does the deed, with $w=x-\lambda y$. (If $y = 0$, simply take $\lambda = 0$ and $w = x$.)
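As a sanity check, here is a minimal sketch of this construction in $\mathbb{R}^3$ with the standard dot product. The vectors `x` and `y` below are just example data, not from the original post:

```python
def dot(u, v):
    """Standard dot product on R^n."""
    return sum(a * b for a, b in zip(u, v))

def decompose(x, y):
    """Split x into lam*y + w with <w, y> = 0. Assumes y != 0."""
    lam = dot(x, y) / dot(y, y)
    w = [xi - lam * yi for xi, yi in zip(x, y)]
    return lam, w

# example data (hypothetical choice)
x = [3.0, 1.0, 2.0]
y = [1.0, 1.0, 0.0]
lam, w = decompose(x, y)
# now x == lam*y + w, and dot(w, y) == 0
```

Running this gives $\lambda = 2$ and $w = (1, -1, 2)$, and one can check directly that $\langle w, y \rangle = 0$ and $x = \lambda y + w$.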

Second answer:

Of course, if $y = 0$, you should have no trouble doing this (just set $w = x$). Let's focus on the case in which $y \neq 0$.

Intuitively, the idea is as follows:

Given a vector $y$, we'd like to be able to write $x$ as the sum of two components: one which is parallel to $y$, and one which is perpendicular. That is, we'd like to be able to write $x = x^{||} + x^\perp$ where $x^{||} = \lambda y$ and $\langle x^\perp,y\rangle = 0$. How could we do this?

It's easy if we use what we know about vector spaces with inner products (assuming, for this argument, that $V$ is finite-dimensional). Starting with $y$, we could come up with an orthonormal basis for $V$ (using the Gram-Schmidt process). That is, we can construct a set $$ v_1 = \frac{y}{\|y\|},v_2,\dots,v_n $$ such that $$ \langle v_i,v_j \rangle = \begin{cases} 1 & i=j\\ 0 & i \neq j \end{cases} $$ So, since this set is a basis, we can find coefficients $a_1,\dots,a_n$ (namely $a_i = \langle x, v_i \rangle$) so that $$ x = \overbrace{a_1 v_1}^{x^{||}} + \overbrace{a_2 v_2 + \cdots + a_n v_n}^{x^\perp} $$ In particular, $\lambda = a_1/\|y\|$.