Write a vector as a linear combination of orthonormal set vectors


How can I determine whether a vector can be expressed as a linear combination of the vectors of an orthonormal set?

Let's say I have an orthonormal set of vectors $\{v_1, v_2\}$: $$ v_1=\left(-\frac{1}{2},-\frac{1}{2},-\frac{1}{2},\frac{1}{2}\right) $$ $$ v_2=\left(\frac{1}{2\sqrt{19}},\:\frac{5}{2\sqrt{19}},\:\frac{1}{2\sqrt{19}},\:\frac{7}{2\sqrt{19}}\right) $$

$v_1$ and $v_2$ form an orthonormal set, since $v_1 \cdot v_2 = 0$ and $\|v_1\| = \|v_2\| = 1$.

and I have a vector: $$(1, 1, 1, 1)$$

I know how to express it as a linear combination of $v_1$ and $v_2$, but how can I tell whether it can be expressed as such a linear combination in the first place? Is it enough to check against the following formula? $$ x = (x\cdot v_1)v_1+(x\cdot v_2)v_2 $$

Thank you.


There are 2 answers below.

BEST ANSWER

One way is to check the rank of the following matrix: $$A = \begin{bmatrix} -\frac 12 & -\frac 12 & -\frac 12 & \frac 12\\ \frac{1}{2\sqrt{19}} & \frac{5}{2\sqrt{19}} & \frac{1}{2\sqrt{19}} & \frac{7}{2\sqrt{19}} \\ 1 & 1 & 1 & 1 \end{bmatrix} $$

If rank $A = 3$, which is the case here, then the three vectors are linearly independent, so the vector $(1,1,1,1)$ cannot be written as a linear combination of $v_1$ and $v_2$. If rank $A = 2$, the three vectors would be linearly dependent, and since $v_1$ and $v_2$ are themselves independent, $(1,1,1,1)$ would then lie in their span.
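A quick numerical version of this check (a minimal sketch using NumPy; the rank tolerance is left to `numpy.linalg.matrix_rank`'s defaults):

```python
import numpy as np

s = 2 * np.sqrt(19)
v1 = np.array([-0.5, -0.5, -0.5, 0.5])
v2 = np.array([1/s, 5/s, 1/s, 7/s])
x  = np.array([1.0, 1.0, 1.0, 1.0])

# Stack v1, v2 and x as rows, exactly as in the matrix A above.
A = np.vstack([v1, v2, x])

print(np.linalg.matrix_rank(A))  # 3, so x is not in span{v1, v2}
```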

OTHER ANSWER

Let $E$ be the vector space generated by $v_1$ and $v_2$. The orthogonal projection of a vector $x$ onto $E$ is precisely the vector $x':=(x \cdot v_1)v_1 + (x \cdot v_2)v_2$ that you wrote. I claim that $x$ is a linear combination of $v_1$ and $v_2$ if and only if it belongs to $E$, that is, if and only if $x=x'$.

This can be proved by a simple linear algebra argument. Since $(v_1,v_2)$ is an orthonormal set, it can be completed to an orthonormal basis $(v_1,v_2,v_3,v_4)$ of the ambient space (which I take to be $\mathbf{R}^4$). Since it is a basis, there are unique scalars $\lambda_1,\ldots,\lambda_4$ in $\mathbf{R}$ such that $x = \lambda_1 v_1 + \cdots + \lambda_4 v_4$, and by orthonormality $x \cdot v_i = \lambda_i$ for each $i$. Hence $x = x'$ exactly when $\lambda_3 = \lambda_4 = 0$, that is, exactly when $x$ is a linear combination of $v_1$ and $v_2$.
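As an illustration of this criterion (again a minimal NumPy sketch, using the vectors from the question), one can compute the projection $x'$ and compare it with $x$:

```python
import numpy as np

s = 2 * np.sqrt(19)
v1 = np.array([-0.5, -0.5, -0.5, 0.5])
v2 = np.array([1/s, 5/s, 1/s, 7/s])
x  = np.array([1.0, 1.0, 1.0, 1.0])

# Orthogonal projection of x onto span{v1, v2}: x' = (x.v1) v1 + (x.v2) v2
x_proj = (x @ v1) * v1 + (x @ v2) * v2

print(x_proj)                  # approximately [0.684, 1.421, 0.684, 0.789]
print(np.allclose(x_proj, x))  # False: x is not a linear combination of v1 and v2
```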