In this proof the author tries to prove $m \leq n$, but then assumes the replacement process can be carried out $m$ times. How can we be sure we don't run out of the $n$ spanning vectors?
If this is not a problem, why is the proof so long? Can't we just use the Linear Dependence Lemma to remove one spanning vector and add one linearly independent vector until we run out of linearly independent vectors?

Linear Algebra Done Right confusing proof
104 views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-02-23

There are 2 answers below.
Suppose the set of vectors obtained at step $j$ is denoted $B_j$,$\newcommand{\spn}{\text{span}}$ and $B_0$ is the set containing all the $w$'s. The main idea behind this argument is that $\spn(B_j)=\spn(B_{j+1})$ for every $j$; this is guaranteed by part (b) of the quoted Linear Dependence Lemma. Since $u_{j+1}\in\spn(B_0)=\spn(B_j)$, the list $\{u_{j+1}\}\cup B_{j}$ is guaranteed to be linearly dependent, so at each step there is always some vector available to remove.
Now suppose the $n$ vectors in $B_0$ were all used up while some $u$'s remained. Then $B_j$ would consist entirely of $u$'s, so $\{u_{j+1}\}\cup B_j$ would be a sublist of the linearly independent $u$'s and hence linearly independent, contradicting the dependence just established. This is impossible, so the $w$'s cannot run out before the $u$'s do.
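The exchange process described here can be checked concretely. Below is a small numerical sketch (not from the book; the example vectors and the exact-arithmetic rank helper are my own choices) that runs the process in $\mathbb{R}^3$ with $n = 3$ spanning $w$'s and $m = 2$ independent $u$'s, verifying at each step that a $w$ is removed and the span is preserved:

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gauss-Jordan elimination in exact rational arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def in_span(v, vecs):
    """v lies in span(vecs) iff adjoining v does not raise the rank."""
    return rank(vecs + [list(v)]) == rank(vecs)

# w's span R^3 (n = 3); u's are linearly independent (m = 2)
w = [[1, 0, 0], [1, 1, 0], [0, 1, 1]]
u = [[1, 2, 3], [0, 1, 1]]

B = [row[:] for row in w]          # B_0: all the w's
for j, uj in enumerate(u):
    B.insert(j, uj[:])             # adjoin u_{j+1} after the earlier u's
    # B is now linearly dependent, so by the Linear Dependence Lemma some
    # vector lies in the span of its predecessors; it cannot be one of the
    # u's (they are independent), so the search may start at index j + 1.
    for k in range(j + 1, len(B)):
        if in_span(B[k], B[:k]):
            del B[k]               # removing it does not change the span
            break
    assert rank(B) == rank(w)      # span preserved at every step

print(len(B), rank(B))  # -> 3 3
```

The inner loop always finds a $w$ to delete: if the $w$'s were exhausted first, $B$ would consist solely of independent $u$'s and inserting $u_{j+1}$ could not create a dependence, which is exactly the contradiction in the answer.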

Let's assume that $m > n$. In this case, after step $n$, we are left with the list of vectors $u_1, \ldots, u_n$, which by the argument in the proof must span $V$. Consider the vector $u_{n+1}$, which exists since $m > n$. On the one hand, the list $u_1, \ldots, u_{n+1}$ is assumed to be linearly independent. On the other hand, $u_{n+1} \in V = \operatorname{span}(u_1, \ldots, u_n)$, so the list $u_1, \ldots, u_{n+1}$ is linearly dependent. Contradiction.