I am trying to prove that the set $S = \{v\}$, where $v\neq \mathbf{0}$ is an element of a vector space $W$, is linearly independent. My textbook makes this statement without proof. It seems obvious to me, but I get the feeling there may be something more to it than I realize.
Here is my stab at the proof:
We can write $x_{1}v =\mathbf{0}$ (where $x_{1}$ is some scalar), and since we know $v \neq \mathbf{0}$, $x_{1}$ has to be $0$. In that case we have only the trivial solution, so by definition $S = \{v\}$ is linearly independent.
Is there something more rigorous to the proof of this statement or is it just so trivial that it requires no more than a little bit of explanation based on definitions?
That's all there is to it. A set of finitely many vectors, say $n$ of them, has at most $n$ degrees of freedom in $\sum_i x_i v_i$, so a proof of linear independence is all about showing that none of them survive when we impose $\sum_i x_i v_i=\mathbf{0}$. But since there was only $1$ to begin with, this condition reduces the count to $0$.
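For completeness, here is the one nontrivial step written out in full; this is just a sketch of the standard argument, and it assumes the scalars come from a field, so every nonzero scalar is invertible. If $x_{1}\neq 0$, then
$$
x_{1}v=\mathbf{0}
\;\Longrightarrow\;
v = 1\cdot v = (x_{1}^{-1}x_{1})v = x_{1}^{-1}(x_{1}v) = x_{1}^{-1}\mathbf{0} = \mathbf{0},
$$
contradicting $v\neq\mathbf{0}$. Hence $x_{1}=0$ is the only solution of $x_{1}v=\mathbf{0}$, which is exactly the definition of $\{v\}$ being linearly independent.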