The length of every linearly independent list of vectors is less than or equal to the length of every spanning list of vectors.

I am referring to Theorem 2.23 of the book Linear Algebra Done Right by Axler. It states:

Theorem: In a finite-dimensional vector space, the length of every linearly independent list of vectors is less than or equal to the length of every spanning list of vectors.

Proof

Suppose $u_1, u_2, \ldots, u_m$ is linearly independent in V. Suppose also that $w_1, w_2, \ldots, w_n$ spans V. We need to prove that $m \leq n$. We do so through the multi-step process described below; note that in each step we add one of the $u$’s and remove one of the $w$’s.

Step 1

Let B be the list $w_1,w_2,...,w_n$, which spans V. Thus adjoining any vector in V to this list produces a linearly dependent list (because the newly adjoined vector can be written as a linear combination of the other vectors). In particular, the list $u_1,w_1,...,w_n$ is linearly dependent. Thus by the Linear Dependence Lemma (2.21), we can remove one of the $w$’s so that the new list B (of length $n$) consisting of $u_1$ and the remaining $w$’s spans V.

Step j

The list B (of length $n$) from step $j-1$ spans V. Thus adjoining any vector to this list produces a linearly dependent list. In particular, the list of length $n+1$ obtained by adjoining $u_j$ to B, placing it just after $u_1,u_2,...,u_{j-1}$, is linearly dependent. By the Linear Dependence Lemma (2.21), one of the vectors in this list is in the span of the previous ones, and because $u_1,u_2,...,u_j$ is linearly independent, this vector is one of the $w$’s, not one of the $u$’s. We can remove that $w$ from B so that the new list B (of length $n$) consisting of $u_1,u_2,...,u_j$ and the remaining $w$’s spans V.
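The multi-step process above is effectively an algorithm, so it can be run. Below is a minimal sketch (mine, not code from the book): it carries out the exchange in $\mathbb{Q}^d$ with exact rational arithmetic, and the helper names `in_span` and `exchange` are invented for this illustration.

```python
# A runnable sanity check of the proof's exchange procedure, worked out in
# Q^d with exact rational arithmetic. This is an illustrative sketch, not
# code from the book; `in_span` and `exchange` are invented names.
from fractions import Fraction

def in_span(v, vecs):
    """Return True if v is a linear combination of vecs, decided by
    Gaussian elimination over the rationals on the augmented system."""
    d, n = len(v), len(vecs)
    if n == 0:
        return all(x == 0 for x in v)
    # Columns of the matrix are the vectors in vecs; right-hand side is v.
    rows = [[Fraction(vec[i]) for vec in vecs] + [Fraction(v[i])]
            for i in range(d)]
    r = 0
    for c in range(n):
        pivot = next((i for i in range(r, d) if rows[i][c] != 0), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        rows[r] = [x / rows[r][c] for x in rows[r]]
        for i in range(d):
            if i != r and rows[i][c] != 0:
                f = rows[i][c]
                rows[i] = [x - f * y for x, y in zip(rows[i], rows[r])]
        r += 1
    # Solvable iff no zero row is left with a nonzero right-hand side.
    return all(rows[i][n] == 0 for i in range(r, d))

def exchange(us, ws):
    """Run the proof's steps: adjoin u_j, then remove one w so that the
    list still spans. Assumes ws spans and us is linearly independent."""
    remaining = list(ws)
    for j, u in enumerate(us):
        # The list us[:j] + remaining spans V, so adjoining u makes the
        # longer list dependent; by the Linear Dependence Lemma the first
        # vector lying in the span of its predecessors must be a w.
        prefix = us[:j + 1]
        k = next(i for i in range(len(remaining))
                 if in_span(remaining[i], prefix + remaining[:i]))
        del remaining[k]  # the shortened list still spans V
    return us + remaining
```

Each pass through the loop trades one $w$ for one $u$, so the procedure can only run to completion if there are at least as many $w$’s as $u$’s; that is exactly the inequality $m \leq n$.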

I have a problem with the part that states:

By the Linear Dependence Lemma (2.21), one of the vectors in this list is in the span of the previous ones, and because $u_1,u_2,...,u_j$ is linearly independent, this vector is one of the $w$’s, not one of the $u$’s.

Why does linear independence of the list $u_1,u_2,...,u_j$ imply that one of the $u$'s cannot be written as a linear combination of the rest of the $u$'s and $w$'s in the list?

What I can understand is this: if the author had said that it must be possible to select one of the $w$'s, because otherwise the $u$'s would end up being linearly dependent, then he would be right. If none of the $w$'s can be chosen and the list is known to be linearly dependent, then one of the $u$'s would end up being in the span of the rest of the $u$'s. That is not what he states, though. He states that it has to be one of the $w$'s, and I think that statement is wrong.

If I am making a mistake in the way I have understood the proof, please help me understand it correctly.


There are 2 answers below.

On BEST ANSWER

By the Linear Dependence Lemma (2.21), one of the vectors in this list is in the span of *the previous ones*

(emphasis added)

The author is not commenting on whether one of the $u$'s can be written as a linear combination of the $w$'s which follow it in the list. The Linear Dependence Lemma gives that one of the vectors in this list is in the span of the ones preceding it in the list, i.e., one of the following things is true:

  • $u_2$ is in the span of $\{u_1\}$
  • $u_3$ is in the span of $\{u_1,u_2\}$
  • ...
  • $u_j$ is in the span of $\{u_1,\dots,u_{j-1}\}$
  • $w_1$ is in the span of $\{u_1,\dots,u_j\}$
  • $w_2$ is in the span of $\{u_1,\dots,u_j,w_1\}$
  • ...

Since the $u$s are linearly independent, it is impossible for any $u$ to be in the span of previous vectors on the list. So it must be a $w$ which is in the span of previous vectors on the list.
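A concrete example (mine, not from the original answer) may make the casework vivid. In $\mathbb{R}^2$ take $u_1 = (1,0)$, $u_2 = (0,1)$ and $w_1 = (1,1)$, $w_2 = (1,2)$, and scan the list $u_1, u_2, w_1, w_2$ from the left. $u_2$ is not in the span of $\{u_1\}$, by independence, but $w_1$ is in the span of $\{u_1, u_2\}$, since $$w_1 = 1\cdot u_1 + 1\cdot u_2$$ So the first vector that falls in the span of its predecessors is a $w$: the $u$-cases in the bullet list are ruled out by independence, so one of the $w$-cases must hold.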

Second answer

At step $j$ we have the list of $n$ vectors $$u_1,\ldots,u_j,w_1,\ldots,w_{n-j}$$ which span $V$. Then we add $u_{j+1}$ to the list and get $$u_1,\ldots,u_{j+1},w_1,\ldots,w_{n-j}$$ Because $u_{j+1}$ is in the span of the previous vectors (they span all of $V$), this new list is linearly dependent (bullet point 2.20). So there must be $n+1$ scalars $a_1,\ldots, a_{n+1}$, not all equal to $0$, such that $$a_1 u_1 + \cdots + a_{j+1} u_{j+1} + a_{j+2} w_1 + \cdots + a_{n+1} w_{n-j} = 0$$ Now, to answer the question of why we can always choose one of the $w$'s as a linear combination of the rest: notice that if all of the scalars from $a_{j+2}$ to $a_{n+1}$ were equal to $0$, then all the scalars from $a_1$ to $a_{j+1}$ would have to be $0$ as well, because the $u$'s are linearly independent. But we said that not all of the $a$'s are equal to $0$.
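The coefficient argument can be checked in a minimal concrete case (my own illustration, not part of the original answer): with $u_1=(1,0)$, $u_2=(0,1)$, $w_1=(1,1)$ in $\mathbb{Q}^2$, any nontrivial dependence relation must put a nonzero scalar on $w_1$, because the $u$'s alone admit only the trivial relation.

```python
# Minimal concrete check of the coefficient argument: with u1, u2 linearly
# independent and w1 = u1 + u2, a nontrivial relation
#   a1*u1 + a2*u2 + a3*w1 = 0
# must have a3 != 0. The names here are illustrative only.
from fractions import Fraction

u1, u2, w1 = (1, 0), (0, 1), (1, 1)

# One explicit nontrivial relation: u1 + u2 - w1 = 0.
a1, a2, a3 = Fraction(1), Fraction(1), Fraction(-1)
combo = tuple(a1 * u1[i] + a2 * u2[i] + a3 * w1[i] for i in range(2))
assert combo == (0, 0) and a3 != 0

# If a3 were 0, the relation would read b1*u1 + b2*u2 = (b1, b2) = 0,
# forcing b1 = b2 = 0 -- only the trivial relation, because u1, u2 are
# linearly independent. Check that exhaustively over a small grid:
for b1 in range(-2, 3):
    for b2 in range(-2, 3):
        if tuple(b1 * u1[i] + b2 * u2[i] for i in range(2)) == (0, 0):
            assert b1 == 0 and b2 == 0
```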