Linear independence


In my mind there is a conflict between the "intuitive" definition of linear dependence of two vectors, i.e., that:

$\vec{v_1}=k\vec{v_2}$

and the formal definition, which says that there must be at least one scalar (i.e., one is enough) that is non-zero in the linear combination:

$\alpha_1\vec{v_1}+\alpha_2\vec{v_2}+...+\alpha_n\vec{v_n} = \vec{0}$

For me that implies that the case where ALL the OTHER $\alpha_i$ are equal to $0$ must be taken into account, i.e., that the equation is:

$0\vec{v_1}+0\vec{v_2}+\cdots+\alpha_i\vec{v_i}+\cdots+0\vec{v_n} = \vec{0}$

and therefore $\alpha_i\vec{v_i} = \vec{0}$,

which implies $\vec{v_i} = \vec{0}$, and that seems useless. That is, to me the definition should say that there are at least 2 non-zero $\alpha_i$... but clearly only 1 is needed.

What is wrong in my reasoning? (I know, of course, that the definition is correct.)

  • Of course there are cases where more than one of the scalars is non-zero, in which case there is no problem; my problem is with the "at least one".
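As a concrete numerical sketch of the concern (a hypothetical example added for illustration, not part of the original question): with two non-zero, dependent vectors in the plane, a vanishing combination uses two non-zero coefficients, whereas a single non-zero coefficient only produces the zero vector when the vector it multiplies is itself the zero vector.

```python
def combo(coeffs, vecs):
    """Return the linear combination sum_i coeffs[i] * vecs[i],
    with vectors represented as tuples of floats."""
    dim = len(vecs[0])
    return tuple(sum(a * v[j] for a, v in zip(coeffs, vecs))
                 for j in range(dim))

# A dependent pair of non-zero vectors: v2 = 2 * v1.
v1, v2 = (1.0, 2.0), (2.0, 4.0)

# Two non-zero coefficients give the zero vector: 2*v1 - 1*v2 = 0.
print(combo((2.0, -1.0), (v1, v2)))   # (0.0, 0.0)

# A single non-zero coefficient does NOT: 1*v1 + 0*v2 = v1 != 0.
print(combo((1.0, 0.0), (v1, v2)))    # (1.0, 2.0)

# The only way one non-zero coefficient kills the sum is a null vector.
z = (0.0, 0.0)
print(combo((5.0, 0.0), (z, v1)))     # (0.0, 0.0)
```

This matches the question's worry: the relation $\alpha_i\vec{v_i}=\vec{0}$ with $\alpha_i\neq 0$ forces $\vec{v_i}=\vec{0}$, so for non-zero vectors a dependence relation never has exactly one non-zero coefficient.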

On BEST ANSWER

It turns out that, unless one of the $v_i$'s is the null vector, it never happens that only one of the $\alpha_i$'s is non-zero. In other words, if $v_1,v_2,\ldots,v_n\neq0$ and if $v_1,v_2,\ldots,v_n$ are linearly dependent, then there exist coefficients $\alpha_1,\alpha_2,\ldots,\alpha_n$, of which at least two are non-zero, such that$$\alpha_1v_1+\alpha_2v_2+\cdots+\alpha_nv_n=0.$$
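A short argument for this fact, added here as a sketch using the same symbols as above:

```latex
Suppose, for contradiction, that only one coefficient is non-zero, say
$\alpha_k \neq 0$ while $\alpha_i = 0$ for every $i \neq k$. The
dependence relation then collapses to
$$\alpha_k v_k = 0,$$
and multiplying by $\tfrac{1}{\alpha_k}$ gives $v_k = 0$, contradicting
the assumption that none of the $v_i$'s is the null vector. Hence at
least two of the $\alpha_i$'s must be non-zero.
```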

On the other hand, concluding from "at least one of the $\alpha_i$'s is non-zero" that all the others must be zero is a non sequitur: the definition leaves the remaining coefficients free, and for non-zero vectors at least one more of them is forced to be non-zero as well.