Decomposition of a Vector through Dot Products with Basis Vectors: Where Does the Necessity of Normalization Follow From?


If we have an orthonormal set $E$ of vectors $e_1$ to $e_n$, we can decompose any given vector $v$ from the span of $E$ by dotting it with the $e_i$. Proof:

As $v$ is in the space of $E$ it must be the case that $v = \sum_{i=1}^n \alpha_i e_i$. If we wish to get the $k$th component of $v$ we can thus do

$$e_k \bullet v = e_k \bullet \sum_{i=1}^n \alpha_i e_i = \underbrace{\sum_{i=1}^n \alpha_i \, (e_k \bullet e_i) = \alpha_k \, (e_k \bullet e_k)}_\text{as $e_k \bullet e_i = 0$ for all $i \neq k$, since all $e_i$ are orthogonal.} = \alpha_k$$
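To see this concretely, here is a quick numerical check of the orthonormal case (a hypothetical example; the basis and coefficients below are made up for illustration):

```python
import numpy as np

# An orthonormal basis of R^2: both vectors have unit length and are orthogonal.
e1 = np.array([1/np.sqrt(2),  1/np.sqrt(2)])
e2 = np.array([1/np.sqrt(2), -1/np.sqrt(2)])

# Build v from known coefficients alpha = (3, -2).
v = 3*e1 - 2*e2

# Dotting v with each basis vector recovers the coefficients directly,
# with no normalisation needed.
print(np.dot(e1, v))  # approximately 3 (= alpha_1)
print(np.dot(e2, v))  # approximately -2 (= alpha_2)
```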

If $E$ is not orthonormal, but merely orthogonal, all $e_i$ need to be normalised, so we end up with

$\alpha_k = \frac{e_k \bullet v}{e_k \bullet e_k}$ for the $k$th component of $v$.

At least that is what I was told. I don't see, however, where the proof ever uses the fact that all $e_i$ have unit length. It seems to work just as well if the $e_i$ have different lengths: it still holds that $e_k \bullet e_k = 1$ and that $e_k \bullet e_i = 0 ~ \forall ~i \neq k$. Therefore the $k$th component of $v$ should still simply be $v \bullet e_k$, according to the logic of the proof.

However, I also realise that in general $v \bullet e_k \neq v \bullet (c\, e_k)$ for a factor $c \neq 1$. So... is there a hole in the proof, or am I missing the point where the unit length of the $e_i$ is actually used?



BEST ANSWER

The step $\alpha_k\,(e_k\cdot e_k)=\alpha_k$ holds precisely because $|e_k|^2=1$. If the $e_k$ are not orthonormal but only orthogonal, then:

\begin{align} e_k\cdot v = \alpha_k\,(e_k\cdot e_k) = \alpha_k\,|e_k|^2\\ \implies \alpha_k=\frac{e_k\cdot v}{e_k\cdot e_k} \end{align}


"It still holds that $e_k \bullet e_k = 1$"

Under your assumptions (the $e_k$ are still orthogonal, but not necessarily unit length), this is in general false. You might want to look at this example:

$$ e_1 = \pmatrix{2 \\ 0}\\ e_2 = \pmatrix{0 \\ 2} $$ in which $e_k \cdot e_k = 4$ for $k = 1, 2$.

If you try to decompose the vector $$ v = \pmatrix{2\\2} = e_1 + e_2 $$ using your proposed formula, you'll see what goes wrong.
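Following the answer's example, a short check (using NumPy) shows exactly what goes wrong with the naive formula and how dividing by $e_k \cdot e_k$ fixes it:

```python
import numpy as np

# The answer's example: an orthogonal basis with |e_k|^2 = 4.
e1 = np.array([2.0, 0.0])
e2 = np.array([0.0, 2.0])
v = e1 + e2  # true coefficients are (1, 1)

# Naive formula: the bare dot product overshoots by the factor |e1|^2 = 4.
print(np.dot(v, e1))                   # 4.0, not the true coefficient 1

# Corrected formula: dividing by e1 . e1 recovers the coefficient.
print(np.dot(v, e1) / np.dot(e1, e1))  # 1.0
```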