Linear independence and a simple case


Definition. Let $x_{1},x_{2},\dots,x_{n}$ be vectors. We say that $x_{1},x_{2},\dots,x_{n}$ are linearly independent if for all scalars $\alpha _{1},\alpha _{2},\dots,\alpha _{n}$,

$\alpha _{1}x_{1}+\alpha _{2}x_{2}+\cdots+\alpha _{n}x_{n}=0$

implies $\alpha_{1}=\alpha_{2}=\cdots=\alpha_{n}=0$.

Simple case:

The statement: for $n=1$, the condition reads "$\alpha x=0$ implies $\alpha=0$". This is equivalent to $x\neq 0$. (Why?)

My question is: why must $x\neq 0$ hold, and how can I prove this equivalence?


Best answer:

The set of vectors $\{0\}$ is not linearly independent: take $\alpha=1$. Then $1\cdot 0=0$ while $1\neq 0$, so a zero linear combination exists with a nonzero coefficient, and the implication in the definition fails.

Conversely, suppose $x\neq 0$ and $\alpha x=0$. If $\alpha\neq 0$, then multiplying by $\alpha^{-1}$ gives $x=\alpha^{-1}(\alpha x)=\alpha^{-1}\cdot 0=0$, contradicting $x\neq 0$. Hence $\alpha=0$. So $\{x\}$ is linearly independent precisely when $x\neq 0$.
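The "$x\neq 0$" direction of the argument can also be written formally. Below is a minimal sketch in Lean 4, assuming Mathlib's `Module` API is available (the specific lemma names `smul_smul`, `inv_mul_cancel₀`, `one_smul`, and `smul_zero` are standard Mathlib names, but treat this as an illustration, not verified against a particular Mathlib version):

```lean
import Mathlib.Algebra.Module.Basic

-- If x ≠ 0 in a vector space over a field K, then α • x = 0 forces α = 0.
example {K V : Type*} [Field K] [AddCommGroup V] [Module K V]
    (x : V) (hx : x ≠ 0) (α : K) (hα : α • x = 0) : α = 0 := by
  by_contra h          -- assume α ≠ 0 and derive x = 0, contradicting hx
  apply hx
  calc x = α⁻¹ • (α • x) := by
          rw [smul_smul, inv_mul_cancel₀ h, one_smul]
       _ = α⁻¹ • (0 : V) := by rw [hα]
       _ = 0 := smul_zero _
```

The proof mirrors the prose above: if $\alpha\neq 0$, scale $\alpha x = 0$ by $\alpha^{-1}$ to conclude $x = 0$.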