Multiples within an inner product space.


Question:

Let $V$ be an inner product space and $v,w\in V$. Prove that $\lvert\langle v,w\rangle\rvert=\lVert v\rVert \lVert w\rVert$ if and only if one of the vectors $v$ or $w$ is a multiple of the other.


Attempt:

Assume the identity holds and, following Friedberg's notation, write $x=v$ and $y=w$ with $y\neq 0$. Let $$ a=\frac{\langle x,y\rangle }{\lVert y \rVert^2}, $$ and let $$ z=x-ay. $$ Now, note that $y$ and $z$ are orthogonal: since $z=x-ay$, the definition of $a$ gives $$a=\frac{\langle x,y\rangle }{\lVert y \rVert^2}=\frac{\langle z+ay, y\rangle }{\langle y,y\rangle }=\frac{\langle z,y\rangle }{\langle y,y\rangle }+a\frac{\langle y,y\rangle }{\langle y,y\rangle }=\frac{\langle z,y\rangle }{\langle y,y\rangle }+a,$$ which means that $$ \frac{\langle z,y\rangle }{\langle y,y\rangle }=0,~\text{i.e.}~ \langle z,y\rangle =0,$$ namely $y$ and $z$ are orthogonal. Furthermore, since $\lvert \langle x, y \rangle \rvert= \lVert x \rVert \lVert y \rVert$, dividing by $\lVert y \rVert$ and then by $\lVert y \rVert$ once more gives $$ \frac{\lvert \langle x, y \rangle \rvert}{\lVert y \rVert} =\lVert x \rVert ~\implies~\frac{\lvert \langle x, y \rangle \rvert}{\lVert y \rVert^2} =\frac{\lVert x \rVert}{\lVert y \rVert} =\lvert a \rvert.$$
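(A quick numerical sanity check of the construction above, not part of the proof: the vectors below are arbitrary illustrative choices in $\Bbb R^3$ with the standard dot product.)

```python
# With a = <x,y>/||y||^2 and z = x - a*y, the construction forces <z,y> = 0.
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

x = [3.0, -1.0, 2.0]
y = [1.0, 4.0, -2.0]

a = dot(x, y) / dot(y, y)                   # a = <x,y> / ||y||^2
z = [xi - a * yi for xi, yi in zip(x, y)]   # z = x - a*y

print(abs(dot(z, y)) < 1e-12)  # True: z is orthogonal to y
```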

Lemma: Let $V$ be an inner product space, and suppose that $x$ and $y$ are orthogonal vectors in $V$. Then $\lVert x+y \rVert^2 = \lVert x \rVert^2 + \lVert y \rVert^2$.

Now, we know that $$ \lVert x \rVert^2 = \lVert ay+z \rVert^2,$$ but, by the lemma, this means that $$ \lVert ay+z \rVert^2 = \lVert ay \rVert^2 + \lVert z \rVert^2~~\overset{\text{not sure where to go}}{\dots}$$


Note:

I'm trying to follow along with Friedberg's description on pages 337 and 338 of his Linear Algebra:

[images of Friedberg, Linear Algebra, pp. 337–338, omitted]


EDIT$^1$:

OK, so I think I've got an idea. To complete the proof I need to notice that $\frac{\lVert x \rVert}{\lVert y \rVert} =\lvert a \rvert$, which means that $\lVert x \rVert = \lvert a \rvert\lVert y \rVert$ and more importantly that $\lVert x \rVert^2 = \lvert a \rvert^2\lVert y \rVert^2,$ but this is exactly the $\lVert ay\rVert^2$ from the application of the lemma as $\lVert ay\rVert^2=(\lVert ay\rVert)^2=(\lvert a\rvert\lVert y\rVert)^2=\lvert a\rvert^2\lVert y\rVert^2$. This will give the result.


CONCLUSION:

Here is my end proof:

[image of the final proof omitted]


EDIT$^2$:

I found a different flavor of this proof in E. B. Vinberg's A Course in Linear Algebra (if anybody happens to care):

[excerpt from Vinberg omitted]

There are 3 answers below.


Accepted Answer:

Notice that: $$ \begin{align*} \lVert x \rVert^2 &= \lVert ay+z \rVert^2 \\ \lVert x \rVert^2 &= \lVert ay \rVert^2 + \lVert z \rVert^2 \\ \lVert x \rVert^2 &= (|a|\lVert y \rVert)^2 + \lVert z \rVert^2 \\ \lVert x \rVert^2 &= \left( \dfrac{\lVert x\rVert}{\lVert y\rVert}\lVert y \rVert \right)^2 + \lVert z \rVert^2 \\ \lVert x \rVert^2 &= \lVert x \rVert^2 + \lVert z \rVert^2 \\ 0 &= \lVert z \rVert^2 \\ \lVert z \rVert &= 0 \\ z &= 0 \qquad \text{ (the zero vector)} \end{align*} $$

Thus, we have $0=z=x-ay \iff x=ay$ so that $x$ and $y$ are scalar multiples of each other, as desired.

It remains to prove the converse. Suppose $x=ay$ for some scalar $a$. Then we have: $$\lvert\langle x,y\rangle\rvert =\lvert\langle ay,y\rangle\rvert =\lvert a\langle y,y\rangle\rvert =\lvert a \left( \lVert y\rVert \lVert y\rVert \right) \rvert =\lvert a \rvert \left( \lVert y\rVert \lVert y\rVert \right) =\lVert ay\rVert \lVert y\rVert =\lVert x\rVert \lVert y\rVert$$ as desired.
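(A small numerical illustration of both directions, not part of the proof; the vectors are arbitrary choices in $\Bbb R^3$ with the standard dot product: equality holds when $x$ is a multiple of $y$, and is strict for a generic non-multiple.)

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

y = [1.0, 4.0, -2.0]
x_mult = [-2.0 * yi for yi in y]   # x = -2y, a multiple of y
x_free = [3.0, -1.0, 2.0]          # not a multiple of y

# Equality |<x,y>| = ||x|| ||y|| for the multiple:
print(math.isclose(abs(dot(x_mult, y)), norm(x_mult) * norm(y)))  # True
# Strict inequality for the non-multiple:
print(abs(dot(x_free, y)) < norm(x_free) * norm(y))               # True
```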

Answer:

You basically want to show that $$\left\langle\frac{v}{\|v\|},\frac{w}{\|w\|}\right\rangle=\frac{\langle v,w\rangle}{\|v\|\cdot\|w\|}=1$$ if and only if $v$ is a positive multiple of $w$ (in the real case, replacing $w$ by $-w$ handles the other sign), which will be the case if and only if $\frac{v}{\|v\|}=\frac{w}{\|w\|}$. Hence, let us assume without loss of generality that $\|w\|=\|v\|=1$ and prove that $\langle v,w\rangle = 1$ if and only if $w=v$. Indeed, "$\Leftarrow$" is easy since $v=w$ implies $\langle v,w\rangle = \langle v,v\rangle=\|v\|^2 = 1$. For "$\Rightarrow$", decompose $V=\mathrm{Span}(v)\oplus U$ with $U=\mathrm{Span}(v)^\perp$ and write $w=\lambda v + u$ with $u\in U$. Then, $$1 = \langle v,w\rangle = \langle v, \lambda v + u\rangle = \lambda + \langle v,u\rangle = \lambda.$$ Therefore, $w=v+u$. Then, $$1 = \langle w,w\rangle = \langle v+u,v+u\rangle = \langle v,v\rangle + \langle u,u\rangle + 2\langle v,u\rangle = 1 + \|u\|^2$$ implies $\|u\|=0$, so $u=0$ and $v=w$.
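(A numerical sketch of this decomposition, under illustrative assumptions: unit vectors in $\Bbb R^2$ with the standard dot product. Writing $w=\lambda v+u$ with $u\perp v$ gives $\lambda=\langle v,w\rangle$ and $\|u\|^2=1-\lambda^2$, so $\langle v,w\rangle=1$ would force $u=0$, i.e. $w=v$.)

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

theta = 0.7
v = [1.0, 0.0]
w = [math.cos(theta), math.sin(theta)]   # unit vector at angle theta to v

lam = dot(v, w)                          # lambda = <v,w> = cos(theta)
u = [wi - lam * vi for vi, wi in zip(v, w)]

print(math.isclose(dot(u, v), 0.0, abs_tol=1e-12))   # True: u is orthogonal to v
print(math.isclose(dot(u, u), 1 - lam * lam))        # True: ||u||^2 = 1 - lambda^2
```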

Answer:

What you did and showed in your lengthy post looks fine but rather messy and long. The slickest, simplest proof of this I know is:

Let $\,t\in\Bbb R\,$ be some parameter or, if you will, unknown:

$$0\le||\,x+t\langle x,y\rangle y\,||^2=\langle\;x+t\langle x,y\rangle y\,,\,x+t\langle x,y\rangle y\;\rangle=|\langle x,y\rangle|^2||y||^2\,t^2+2|\,\langle x,y\rangle\,|^2 t+||x||^2$$

Now the beauty: the rightmost expression is a quadratic in $\,t\,$ and since it is always non-negative its discriminant $\;\Delta\;$ is non-positive:

$$\Delta:=4\,|\,\langle x,y\rangle\,|^4-4||x||^2||y||^2|\langle x,y\rangle|^2\le 0\implies |\langle x,y\rangle|\le||x||\,||y||$$

and the above already proves the Cauchy-Schwarz-Buniakovski inequality, but we also get:

$$|\langle x,y\rangle |=||x||\,||y||\implies \color{red}{\left\|\,x-\frac{\langle x,y\rangle}{||y||^2}y\,\right\|^2}=||x||^2+\frac{||x||^2\,||y||^2}{||y||^4}||y||^2-2\frac{||x||^2||y||^2}{||y||^2}=||x||^2+||x||^2-2||x||^2=0\implies$$

$$\implies x=\frac{\langle x,y\rangle}{||y||^2}y\implies\;x\,,\,y\;\;\text{are linearly dependent}$$

Note: Where does the above red term come from? From finding the unique root of the quadratic in $\,t\,$ above: it is $\;\displaystyle{t=-\frac1{||y||^2}}\;$, and substituting $\;\;\ldots\ldots$
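(A numerical view of the quadratic $q(t)=\|x+t\langle x,y\rangle y\|^2$ used above, with arbitrary illustrative vectors in $\Bbb R^3$ and the standard dot product: its discriminant is non-positive, which is exactly Cauchy-Schwarz, and its vertex sits at $t=-1/\|y\|^2$.)

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x = [3.0, -1.0, 2.0]
y = [1.0, 4.0, -2.0]
c = dot(x, y)                    # c = <x,y>

# q(t) = A t^2 + B t + C with A = c^2 ||y||^2, B = 2 c^2, C = ||x||^2
A = c * c * dot(y, y)
B = 2 * c * c
C = dot(x, x)

disc = B * B - 4 * A * C         # 4c^4 - 4 c^2 ||x||^2 ||y||^2
print(disc <= 0)                 # True: |<x,y>| <= ||x|| ||y||

def q(t):
    w = [xi + t * c * yi for xi, yi in zip(x, y)]
    return dot(w, w)

t_star = -1.0 / dot(y, y)        # the vertex -B/(2A) of q
print(q(t_star) <= min(q(t_star - 0.1), q(t_star + 0.1)))  # True: vertex minimizes q
```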