Is the zero vector in the definition of linear dependence arbitrary?


The definition of linear dependence according to Wikipedia is:

The vectors in a subset $S=\{v_1, v_2, \dots, v_k\}$ of a vector space $V$ are said to be linearly dependent if there exist a finite number of distinct vectors $v_1, v_2, \dots, v_n$ in $S$ and scalars $a_1, a_2, \dots, a_n$, not all zero, such that $ a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0, $ where zero denotes the zero vector.

I was wondering if the zero vector in the definition of linear dependence is arbitrary?

Thanks, Jackson

3 Answers

Best Answer

Well, if it were, then we would have a very curious situation. Try replacing $0$ with some fixed vector $v_0 \ne 0$. Then the set $\{ 0 \}$ is independent, but the set $\{ v_0 \}$ isn't!

To make things worse: if you had two vectors $a$ and $b$ such that $\{ a, b, v_0 \}$ was independent in the standard sense, then the set $\{ \lambda a + \mu b \mid \lambda, \mu \in \mathbb{R} \}$ is independent under the modified definition, despite being a whole subspace!

EDIT: Maybe a specific example will help. Say we define "dependent" to mean "there is a non-trivial linear combination that sums to $\langle 1,1,1 \rangle$", so "independent" means no such combination exists. Then the set $\{ \langle 1, 0, 0 \rangle, \langle 0, 1, 0 \rangle, \langle 1, 1, 0 \rangle \}$ is independent (no matter what combination you take, the $z$-component is zero, not one, so you can never get $ \langle 1, 1, 1 \rangle $). But this is clearly silly, because one vector is the sum of the other two, and so whatever our definition is describing, it doesn't capture the notion of independence.
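The example above can be checked numerically. The sketch below (using numpy, which is not part of the original answer) verifies both halves: the three vectors are dependent in the standard sense, yet no combination of them reaches $\langle 1,1,1 \rangle$.

```python
import numpy as np

# Columns are the three vectors from the example above.
A = np.array([[1, 0, 1],
              [0, 1, 1],
              [0, 0, 0]])

target = np.array([1, 1, 1])

# Standard dependence: the rank is less than the number of vectors.
print(np.linalg.matrix_rank(A))  # 2, so the set IS linearly dependent

# But no combination reaches <1,1,1>: appending the target raises the rank,
# which means the system A x = target is inconsistent.
augmented = np.column_stack([A, target])
print(np.linalg.matrix_rank(augmented))  # 3
```

So under the modified "sums to $\langle 1,1,1 \rangle$" definition the set would count as independent, even though it is dependent in the usual sense.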

Another Answer

A set of vectors is called linearly dependent if at least one of them can be expressed as a linear combination of the others. Moving everything to one side, linear dependence is equivalent to the existence of a linear combination equal to the zero vector in which one coefficient is $1$. Since vector spaces are over fields, every non-zero scalar is invertible, so we can drop the requirement that one of the coefficients be $1$ and just require the coefficients to be not all zero.
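A minimal numeric sketch of this rearrangement (numpy assumed; the specific vectors are my own illustration, not from the answer): since $v_3 = v_1 + v_2$, moving $v_3$ to the left gives a non-trivial combination equal to the zero vector.

```python
import numpy as np

# v3 = v1 + v2, so the set {v1, v2, v3} is linearly dependent.
v1, v2, v3 = np.array([1., 0.]), np.array([0., 1.]), np.array([1., 1.])

# Moving v3 to one side: 1*v1 + 1*v2 + (-1)*v3 = 0,
# a linear combination with not-all-zero coefficients that equals zero.
coeffs = np.array([1., 1., -1.])
combo = coeffs[0]*v1 + coeffs[1]*v2 + coeffs[2]*v3
print(combo)  # [0. 0.]

# Over a field we may rescale: any non-zero multiple of coeffs also works,
# which is why the "one coefficient equals 1" requirement can be dropped.
scaled = 5 * coeffs
print(scaled[0]*v1 + scaled[1]*v2 + scaled[2]*v3)  # [0. 0.]
```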

Another Answer

As I understand it, it is not necessary that the vector on the right-hand side be zero. The only necessary condition is that there exists a non-trivial representation of the vector as a linear combination of the other vectors.

Trivial Solution

$a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = 0$ only if $a_1 = a_2 = \cdots = a_n = 0$

Alternatively, for the set to be linearly dependent there must exist at least one non-zero scalar among $a_1, a_2, \dots, a_n$. So the RHS can be non-zero.
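The trivial-versus-non-trivial distinction above can be tested mechanically: a set is dependent exactly when the equation $a_1 v_1 + \cdots + a_n v_n = 0$ has a solution other than all zeros. A sketch using numpy's SVD (the helper name `nontrivial_solution` and the tolerance are my own choices, not from the answer):

```python
import numpy as np

def nontrivial_solution(vectors):
    """Return a non-trivial coefficient vector a with sum(a_i * v_i) = 0,
    or None if only the trivial solution a = 0 exists (independent set)."""
    A = np.column_stack(vectors)
    # Right-singular vectors for (near-)zero singular values span the null space.
    _, s, vt = np.linalg.svd(A)
    if s.size == A.shape[1] and s[-1] > 1e-10:
        return None          # full column rank: only the trivial solution
    return vt[-1]            # a unit-length non-trivial null vector

# Independent pair: only the trivial solution exists.
print(nontrivial_solution([np.array([1., 0.]), np.array([0., 1.])]))  # None

# Dependent triple (third vector is the sum of the first two):
# a non-zero coefficient vector exists.
a = nontrivial_solution([np.array([1., 0.]), np.array([0., 1.]), np.array([1., 1.])])
print(a)
```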