I am currently reading Linear Algebra Done Right (3rd edition) by Sheldon Axler. In the book, there is a theorem (2.23) that states: "The length of every linearly independent list of vectors is less than or equal to the length of every spanning list of vectors."
Does this simply mean that the length of any linearly independent list of vectors in $F^{n}$ can be no more than $n$?

The theorem is a plain and straightforward sentence; however, I'm having trouble convincing myself of it and thoroughly understanding why it holds. Can someone please explain it to me in other words?
Yes: the standard basis $e_1, \dotsc, e_n$ is a spanning list of $F^n$ of length $n$, so the theorem implies that every linearly independent list in $F^n$ has length at most $n$. Here is the idea of the proof. Suppose $u_1, \dotsc, u_m$ is linearly independent in $V$ and $B = w_1, \dotsc, w_n$ spans $V$. We prove that $m \leq n$.
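As a sanity check of the special case asked about (taking $F = \mathbf{R}$ and $n = 3$), here is a quick numpy spot-check. This is only an illustration, not part of the book's argument:

```python
import numpy as np

# Since e_1, e_2, e_3 is a spanning list of R^3 of length 3, the theorem
# says any linearly independent list in R^3 has length at most 3 --
# equivalently, any four vectors in R^3 are linearly dependent.
rng = np.random.default_rng(0)
for _ in range(100):
    A = rng.standard_normal((3, 4))   # four random vectors in R^3 (as columns)
    # rank < number of columns  <=>  the columns are linearly dependent
    assert np.linalg.matrix_rank(A) < 4
```

The rank of a $3 \times 4$ matrix is at most $3$, so the assertion never fires: no choice of four vectors in $\mathbf{R}^3$ can be independent.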
Add $u_1$ to the beginning of $B$. The new list is linearly dependent, since $B$ already spans $V$, and $u_1 \neq 0$ by the linear independence of the $u$'s, so we can apply the linear dependence lemma (2.21, p. 34) to remove one vector from the list while keeping a list that spans $V$. In general, at step $j$ we insert $u_j$ after $u_1, \dotsc, u_{j-1}$ and apply the lemma again. The removed vector is always one of the $w$'s: since $u_1, \dotsc, u_j$ is linearly independent, no $u$ lies in the span of the vectors preceding it, and thus no $u$ satisfies condition (a) of the linear dependence lemma.
After step $m$, all the $u$'s have been added. Since each step adds one vector and removes one, the combined list has length $n$ at the end of every step, and it still spans $V$. If at some step we added a $u$ and there were no $w$'s left to remove, the lemma would force us to remove a $u$, contradicting the linear independence of $u_1, \dotsc, u_m$. So at each of the $m$ steps there is a $w$ available to remove, and hence $m \leq n$.
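The exchange process above can be sketched numerically. This is a toy illustration in $\mathbf{R}^3$ with numpy (the names `in_span` and `exchange` are mine, not the book's), using matrix rank to test membership in a span:

```python
import numpy as np

def in_span(v, vecs):
    """Return True iff v lies in the span of the vectors in vecs."""
    if not vecs:
        return bool(np.allclose(v, 0))
    M = np.column_stack(vecs)
    return np.linalg.matrix_rank(np.column_stack([M, v])) == np.linalg.matrix_rank(M)

def exchange(us, ws):
    """Sketch of the exchange process from the proof.

    us: a linearly independent list; ws: a spanning list of the same space.
    Step j inserts u_j after u_1, ..., u_{j-1}, then removes the first
    vector that lies in the span of the ones before it.  Independence of
    the u's guarantees the removed vector is a w; the assertion fails
    exactly when the proof would reach its contradiction (m > n).
    """
    B = [np.asarray(w, float) for w in ws]
    for j, u in enumerate(us):
        B.insert(j, np.asarray(u, float))  # list is now dependent: it already spanned
        k = next(i for i in range(len(B)) if in_span(B[i], B[:i]))
        assert k > j, "the removed vector must be one of the w's"
        del B[k]                           # still spans, same length as before
    return B

# u_1, u_2 linearly independent in R^3; w_1, w_2, w_3 spans R^3
us = [[1, 0, 0], [0, 1, 0]]
ws = [[1, 1, 0], [0, 1, 1], [0, 0, 1]]
B = exchange(us, ws)
assert len(B) == 3 and np.linalg.matrix_rank(np.column_stack(B)) == 3
```

After the run, `B` starts with the two $u$'s and still spans $\mathbf{R}^3$; trying `exchange` with four independent vectors against a three-element spanning list trips the assertion, mirroring the contradiction in the proof.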
If $m = n$, then $u_1, \dotsc, u_m$ and $w_1, \dotsc, w_n$ are both bases of $V$: the process ends by replacing every $w$ with a $u$, so $u_1, \dotsc, u_n$ spans $V$, and being linearly independent as well, it is a basis. Likewise $w_1, \dotsc, w_n$ is a spanning list whose length equals $\dim V = n$, hence a basis (by a result proved later in the book).