What is the meaning of "The length of every linearly independent list is less than or equal to the length of every spanning list"?


I am currently reading Linear Algebra Done Right (3e), by Sheldon Axler. In the book, there is a theorem (2.23) that states: "In a finite-dimensional vector space, the length of every linearly independent list of vectors is less than or equal to the length of every spanning list of vectors."

Does this simply mean that the length of any linearly independent list belonging to $F^{n}$ can be no more than $n$?

The theorem is a plain and straightforward sentence; however, I'm trying to convince myself of it and thoroughly understand why it is the case. Can someone please explain it to me in other words?
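One way to convince yourself concretely (this is my own sketch, not from the book): a list of vectors is linearly independent exactly when the matrix with those vectors as rows has rank equal to the list's length. The helper names below (`rank`, `is_independent`) are mine; the check shows that in $F^2$ no list of 3 vectors can be independent, so independent lists in $F^2$ have length at most 2.

```python
from fractions import Fraction

def rank(rows):
    """Row rank of a matrix over the rationals, via Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0]) if M else 0):
        # find a pivot row at or below position r
        pivot = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # eliminate this column from every other row
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def is_independent(vectors):
    """A list of vectors (as rows) is linearly independent iff rank == length."""
    return rank(vectors) == len(vectors)

# (1,0), (0,1) is independent -- and already spans F^2 ...
assert is_independent([[1, 0], [0, 1]])
# ... so any list of 3 vectors in F^2 must be linearly dependent:
assert not is_independent([[1, 0], [0, 1], [3, 5]])
```

The same rank test in $F^n$ shows an independent list there can have at most $n$ entries, which is exactly the special case asked about.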


There are 3 answers below.


Suppose $u_1, \dotsc, u_m$ is linearly independent in $V$ and $B = w_1, \dotsc, w_n$ spans $V$. We are proving that $m \leq n$.

If we add $u_1$ to the beginning of $B$, the new list is linearly dependent, since $B$ already spans $V$ and hence $u_1 \in \operatorname{span}(B)$. Also $u_1 \neq 0$ by linear independence of the $u$'s, so we can apply the linear dependence lemma (2.21, p. 34) to remove a vector from the list without changing its span. In general, at step $j$ we insert $u_j$ after $u_1, \dotsc, u_{j-1}$ in the current list. The removed vector is always one of the $w$'s, because $u_1, \dotsc, u_j$ is linearly independent, so none of the $u$'s can lie in the span of the vectors preceding it, i.e. none of them satisfies condition (a) of the linear dependence lemma.

After step $m$, all the $u$'s have been added. Since exactly one vector is added and one removed at each step, the combined list has length $n$ at the end of each step (momentarily $n + 1$ after inserting $u_j$, before a $w$ is removed). If at some step we added a $u$ and there were no $w$'s left to remove, the linear dependence lemma would force one of the independent $u$'s to lie in the span of its predecessors, a contradiction. Thus $m \leq n$.
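The replacement process above can be sketched in code. This is a numerical illustration over the rationals, not the book's argument itself, and the helper names (`rank`, `in_span`, `exchange`) are my own: at step $j$ we insert $u_j$, the linear dependence lemma lets us delete a vector lying in the span of its predecessors, and that vector is always a $w$.

```python
from fractions import Fraction

def rank(rows):
    """Row rank over the rationals (Gaussian elimination helper)."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0]) if M else 0):
        pivot = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def in_span(v, vectors):
    """v lies in span(vectors) iff adjoining v does not raise the rank."""
    return rank(list(vectors) + [list(v)]) == rank(list(vectors))

def exchange(us, ws):
    """One u in, one w out, as in the proof (assumes us independent, ws spanning).
    Returns the final spanning list; its length never changes, so m <= n."""
    B = [list(w) for w in ws]
    for j, u in enumerate(us):
        B.insert(j, list(u))          # length is momentarily n + 1
        # Linear dependence lemma: some vector lies in the span of its
        # predecessors.  It cannot be one of the (independent) u's at
        # positions 0..j, so the search starts after them and finds a w.
        k = next(i for i in range(j + 1, len(B)) if in_span(B[i], B[:i]))
        del B[k]                      # length is back to n
    return B

# Two independent vectors pushed into a spanning list of F^3:
result = exchange([[1, 0, 0], [0, 1, 0]],
                  [[1, 0, 0], [0, 1, 0], [0, 0, 1]])
assert len(result) == 3               # still n = 3, so m = 2 <= n = 3
```

If `us` were longer than `ws`, the `next(...)` search would run out of $w$'s to remove, which is exactly the contradiction the proof invokes.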

If $m = n$, then $u_1, \dotsc, u_m$ and $w_1, \dotsc, w_n$ are both bases of $V$: the process ends with the $u$'s forming a spanning list, so $u_1, \dotsc, u_m$ is an independent spanning list, i.e. a basis; hence $\dim V = n$, and the spanning list $w_1, \dotsc, w_n$ of length $\dim V$ is also a basis.


It means that if $V$ is a vector space over a field $k$, $A \subseteq V$ is a set of linearly independent vectors over $k$, and $B \subseteq V$ is a set with $\operatorname{span}(B) = V$, i.e. a generating set, then $$|A| \leq |B|,$$ i.e. $A$ has at most as many elements as the generating set $B$. So to answer your question: yes, a set of linearly independent vectors in $F^n$ can have no more than $n$ elements, since a basis is a generating (or spanning) set with exactly $n$ elements. This follows from the fact that if a vector $v$ is linearly independent from all vectors in a set $A$, then $v \notin \operatorname{span}(A)$.
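The closing fact — a vector $v$ independent from the vectors in $A$ satisfies $v \notin \operatorname{span}(A)$ — can be checked numerically. This is my own rank-based span test over the rationals (the names `rank` and `in_span` are mine), not part of the answer:

```python
from fractions import Fraction

def rank(rows):
    """Row rank over the rationals, via Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0]) if M else 0):
        pivot = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def in_span(v, A):
    """v is in span(A) iff adjoining v to A does not raise the rank."""
    return rank(A + [v]) == rank(A)

A = [[1, 0, 0], [0, 1, 0]]   # linearly independent set in F^3
v = [0, 0, 1]

# A with v adjoined is still independent (the rank goes up by one) ...
assert rank(A + [v]) == len(A) + 1
# ... equivalently, v is not in span(A):
assert not in_span(v, A)
```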


While formal answers have been given, I want to give a hint to understand mathematics intuitively.

To understand the theorem (or any object/property) intuitively, it makes sense to think of concrete examples. Take for instance $\mathbb{R}^2$ or $\mathbb{R}^3$ (or better: both!). You should have some geometric understanding of these spaces. Make sure to see that the theorem is true in these cases.

It generalizes naturally and gives rise to the notions of basis and dimension.