Is the empty set linearly independent or linearly dependent?

16.4k views

Is the empty set linearly independent or dependent?

6 Answers

Answer (score 34)

By definition, it is linearly independent, because it is not linearly dependent.

A set $S$ is linearly dependent if there exists a finite set of vectors $v_1,\dots, v_n$ and corresponding scalars $\alpha_1,\dots,\alpha_n$ such that there exists at least one $\alpha_i\neq0$ so that $$\sum_{i=1}^n \alpha_i v_i=0$$


Remark: (equivalently, we could demand that all $\alpha_i$ are nonzero, but then we would also need to demand that the collection is nonempty, i.e. $n\geq 1$. This is because the empty collection vacuously satisfies the demand that every scalar in it is nonzero...)


Clearly, there is no finite collection of vectors from $\{\}$ that satisfies the above condition, because there is no such collection at all.
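The definition above can be sketched in code. A minimal sketch, assuming vectors are represented as SymPy column matrices (the helper `is_linearly_independent` is hypothetical, not from the answer): a finite list of vectors admits a nontrivial vanishing combination exactly when the matrix with those columns is rank-deficient, and for the empty set there is simply no collection to test.

```python
from sympy import Matrix

def is_linearly_independent(vectors):
    """Check linear independence of a finite list of SymPy column vectors."""
    n = len(vectors)
    if n == 0:
        # No finite collection of vectors can be drawn from the empty set,
        # so the dependence condition is unsatisfiable: independent.
        return True
    # Stack the vectors as columns; a nontrivial combination summing to 0
    # exists exactly when the columns have rank less than n.
    return Matrix.hstack(*vectors).rank() == n

print(is_linearly_independent([]))                                # True
print(is_linearly_independent([Matrix([1, 0]), Matrix([0, 1])]))  # True
print(is_linearly_independent([Matrix([1, 2]), Matrix([2, 4])]))  # False
```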


Furthermore, the empty set is also a basis of the vector subspace $\{0\}$, because $\{0\}$ is the smallest vector space that includes $\{\}$.


In a way, you could also reason in a roundabout fashion that $\{\}$ is linearly independent, like this:

  • You know that $\{0\}$ is a vector space.
  • You know that every vector space has a basis.
  • You know that the basis of $\{0\}$ is a subset of $\{0\}$, so the basis of $\{0\}$ can either be $\{\}$ or $\{0\}$
  • You know that $\{0\}$ is not a basis because it is not linearly independent (because $1\cdot 0=0$)
  • Therefore, $\{\}$ is a basis.
  • Because all bases are linearly independent, so is $\{\}$

Note, this isn't really a "good" proof, because it commits a sort of begging-the-question fallacy. It wasn't meant as a proof in the mathematical sense, just a proof "to yourself" that you already knew the empty set is linearly independent: that's the only way every vector space can have a basis, and you know that that is true (in fact, you or someone else must have used the linear independence of the empty set while proving that fact).

Answer (score 0)

It is linearly independent.
If a set were linearly dependent, there would be a nontrivial linear combination of vectors in the set adding up to the zero vector; no such combination can be drawn from the empty set. It is likewise impossible to choose a vector in the empty set and write it as a linear combination of the other vectors there, since the empty family has no members.

Answer (score 2)

Not only is it linearly independent, but it is also the basis for $\{\textbf{0}\}$.

Answer (score 3)

Since the correct definition of "linearly dependent" has not been spelled out in detail in any of the answers so far, let me add a new answer. A subset $S$ of a vector space $V$ is defined to be linearly dependent if there exist finitely many distinct elements $s_1,s_2,\dots,s_n\in S$ and scalars $c_1,c_2,\dots,c_n$ which are not all $0$ such that $$\sum_{i=1}^n c_is_i=0.$$

(If $S$ itself is finite, you can just state this condition with $s_1,\dots,s_n$ being all the elements of $S$, since given any relation of this form you can just make the coefficients be $0$ for all the elements of $S$ you have not used.)

When $S=\emptyset$, the only possible collection of finitely many distinct elements of $S$ is the empty collection, with $n=0$. But there does not exist any collection of $0$ scalars, not all of which are $0$. After all, if you have a collection of no scalars, then vacuously all the scalars in your collection are $0$.

Thus $\emptyset$ is linearly independent (as a subset of any vector space).
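The vacuity in this argument can even be checked mechanically. A small sketch (the scalar sample `[-1, 0, 1]` is just an illustrative assumption): the only collection of $n=0$ scalars is the empty tuple, and it fails the "not all zero" test vacuously.

```python
from itertools import product

scalars = [-1, 0, 1]   # a small illustrative sample of scalars
n = 0                  # the only finite subcollection of the empty set has n = 0

# All length-0 tuples of scalars: exactly one, the empty tuple.
candidates = list(product(scalars, repeat=n))
print(candidates)      # [()]

# "Not all zero" means some entry is nonzero -- vacuously false for ().
nontrivial = [c for c in candidates if any(x != 0 for x in c)]
print(nontrivial)      # [] -- no witness of linear dependence exists
```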

Answer (score 0)

The explanations offered above focus largely on showing the empty set is NOT linearly dependent and therefore must be linearly independent. I find this somewhat unsatisfying, so I attempt a more direct proof that the empty set is linearly independent...

Recall that any subset $V = \{ v_1, v_2, ..., v_n \}$ of some vector space $\mathbb{V}$ is a linearly independent set iff the trivial solution is the *only* solution to $\sum_{j=1}^n c_j v_j = \bf 0$, where $n \in \mathbb{Z}_{\ge 0}$; $\bf 0$ is the zero vector; and $c_j$ is a scalar for $j=1,2,...,n$. To be more precise, $V$ is a linearly independent set iff the above equation is satisfied by *exactly one* set of $n$ scalars $S = \{ c_1, c_2,..., c_n \}$ such that, for every $i\in I = \{ 1,2,...,n \}$, we have $c_i = 0$, where $I$ is an index set. Thus, in order to prove some set $V$ is a linearly independent set, we must prove the following two conditional statements:

1st conditional statement:

$ \forall i \Big [ i \in I \to c_i = 0 \Big ] \Rightarrow \sum_{j=1}^n c_j v_j = \bf 0 $

2nd conditional statement:

$ \sum_{j=1}^n c_j v_j = {\bf 0} \Rightarrow \forall i \Big [ i \in I \to c_i = 0 \Big ] $

Now we prove the empty set $\emptyset$ consisting of $n=0$ vectors is a linearly independent set. Of course, since $n=0$, the sequence of $n$ vectors $v_1, v_2, ..., v_n$ in $V$ reduces to an empty sequence, implying $V=\emptyset$. The same is true of the sequence of $n$ scalars in $S$ as well as the sequence of $n$ indices in the index set $I$, implying $S=\emptyset$ and $I=\emptyset$.

We prove the 1st conditional statement as follows: by substitution, we may rewrite the antecedent as $\forall i \Big [ i \in \emptyset \to c_i = 0 \Big ]$. Note that $i\in \emptyset$ is always false because the empty set has no members. Hence, the antecedent is a vacuously true statement. With respect to the consequent, by substitution we may rewrite it as $\sum_{j=1}^0 c_j v_j = \bf 0$. Note that $\sum_{j=1}^0 c_j v_j$ is an empty (or nullary) sum equal to the additive identity element of the vector space, which in this case is the zero vector $\bf 0$. Hence, the consequent reduces to ${\bf 0} = \bf 0$, which is of course true. Since the antecedent and consequent are true, the 1st conditional statement must be true.
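The empty-sum convention used here is easy to see in code. A minimal sketch in plain Python (the helper `vadd` and the $\mathbb{R}^3$ zero vector are assumptions for illustration): folding vector addition over an empty list of terms returns the additive identity.

```python
from functools import reduce

def vadd(u, v):
    """Componentwise vector addition."""
    return [a + b for a, b in zip(u, v)]

zero = [0, 0, 0]  # additive identity of R^3, used as the fold's start value

# Summing an empty list of terms returns the start value:
# the empty sum is the zero vector by convention.
empty_sum = reduce(vadd, [], zero)
print(empty_sum)  # [0, 0, 0]
```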

By the same reasoning, we can show the 2nd conditional statement is true. In this case, the antecedent reduces to ${\bf 0} = \bf 0$, which is true, and the consequent is vacuously true.

Since the 1st and 2nd conditional statements are true, we have shown that $V=\emptyset$ is a linearly independent set.

Answer (score 0)

Belatedly, there might be some silly-but-not-worthless point to be made here, about mathematical conventions. Certainly we grant that everyone here knows full well what an empty set of vectors is. :)

A slightly interesting question is about what we *intend* (not so much "the definition"...) by "linearly independent set of vectors". For example, operationally (as opposed to definitionally), we'd mean that the set cannot be shrunk without shrinking the span. Or, equivalently, "for all vectors $v$ in the set, $v$ is not expressible as a linear combination of the other vectors in the set". Yes, there are other linguistic variants on the definition.

But, if we choose the above-mentioned course: since the span of $\emptyset$ is $\{0\}$ (oops, we need to specify what the span of the empty set is... in a given vector space?!) ... this subspace cannot be shrunk, so the set of vectors "spanning" it is linearly independent.
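One standard way to fill that gap (an addition, not spelled out in the answer): define the span of any subset $S\subseteq V$ as the intersection of all subspaces of $V$ containing it,
$$\operatorname{span}(S)=\bigcap\{\,W : W\text{ is a subspace of }V,\ S\subseteq W\,\},$$
and since every subspace contains $\emptyset$,
$$\operatorname{span}(\emptyset)=\bigcap\{\,W : W\text{ is a subspace of }V\,\}=\{0\}.$$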

And/or, the "for all $v$ in the set ..." condition is vacuous for the empty set of vectors.

So, although it has essentially no mathematical content, there are indeed arguments in favor of declaring that the empty set of vectors is linearly independent. :)