Definition. Let $x_{1},x_{2},...,x_{n}$ be vectors. We say that $x_{1},x_{2},...,x_{n}$ are linearly independent if for all scalars $\alpha _{1},\alpha _{2},...,\alpha _{n}$,
$\alpha _{1}x_{1}+\alpha _{2}x_{2}+...+\alpha _{n}x_{n}=0$
implies $\alpha_{1}=\alpha_{2}=...=\alpha_{n}=0$.
Simple case:
(The statement.) For $n=1$, we should have that $\alpha x=0$ implies $\alpha=0$. This is equivalent to $x\neq 0$. (WHY?)
My question is: why must $x\neq 0$ hold? And how can I prove this statement?
The set of vectors $\{0\}$ is not linearly independent.
The reason is simply that $1\cdot 0 = 0$ while the scalar $1$ is not $0$, so the implication in the definition fails.
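For completeness, both directions of the equivalence can be written out; here is a short sketch in LaTeX:

```latex
\begin{proof}
($\Rightarrow$) If $x=0$, then $1\cdot x=0$ with the scalar $1\neq 0$,
so the implication ``$\alpha x=0 \Rightarrow \alpha=0$'' fails;
hence the implication forces $x\neq 0$.

($\Leftarrow$) Suppose $x\neq 0$ and $\alpha x=0$. If $\alpha\neq 0$, then
$x = \alpha^{-1}(\alpha x) = \alpha^{-1}\cdot 0 = 0$,
contradicting $x\neq 0$. Therefore $\alpha=0$.
\end{proof}
```

The key step in the second direction is that a nonzero scalar is invertible, which is exactly what fails for $\alpha=0$.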