I am working on the following problem:
Let $B_1=\{v_1,v_2,\ldots,v_k\}$ and $B_2=\{w_1,w_2,\ldots,w_{k+1}\}$ be two linearly independent sets of vectors from the same vector space $U$. Prove that there exists an index $i\in[k+1]$ such that the set $B_1\cup\{w_i\}$ is linearly independent.
I have tried two approaches and have gotten stuck with both.
Approach 1:
Since $B_2$ is a linearly independent set of vectors from the vector space $U$ with $|B_2|=k+1$, it follows that if $B$ is a basis of $U$ then $|B|\geq k+1$. This means that $\dim(U)\geq k+1$. Now suppose for contradiction that for all $i\in[k+1]$, $B_1\cup\{w_i\}$ is a linearly dependent set of vectors.
I was trying to conclude here that since $B_1\cup\{w_i\}$ is linearly dependent for all $i\in[k+1]$, any basis $B$ of $U$ must have $|B|\leq k$. But I think that only works if $B_1\cup\{w\}$ is linearly dependent for every $w\in U$.
Approach 2:
Since $B_1$ is a linearly independent set of vectors, it follows that if $$a_1\cdot v_1+a_2\cdot v_2+\ldots+a_k\cdot v_k=0$$ for some scalars $a_1,a_2,\ldots,a_k$ then $a_1=a_2=\ldots=a_k=0$. Similarly, since $B_2$ is a linearly independent set of vectors, it follows that if $$b_1\cdot w_1+b_2\cdot w_2+\ldots+b_{k+1}\cdot w_{k+1}=0$$ for some scalars $b_1,b_2,\ldots,b_{k+1}$ then $b_1=b_2=\ldots=b_{k+1}=0$. Now suppose for contradiction that for all $i\in[k+1]$, $B_1\cup\{w_i\}$ is linearly dependent. This means that for each $i$ there exist scalars $c_1,c_2,\ldots,c_{k+1}$ (depending on $i$), not all zero, such that $$c_1\cdot v_1+c_2\cdot v_2+\ldots+c_k\cdot v_k +c_{k+1}\cdot w_i=0.$$ Since the $c_j$ are not all $0$ and $B_1$ is linearly independent, $c_{k+1}\neq 0$. Then $$w_i=-\frac{c_1}{c_{k+1}}\cdot v_1-\ldots-\frac{c_k}{c_{k+1}}\cdot v_k,$$ which means that $w_i\in\text{span}(B_1)$ for all $i\in[k+1]$. In other words, $B_2\subseteq\text{span}(B_1)$.
And now I am stuck on deriving a contradiction.
Can I get some guidance? I think I am overlooking something here.
We can reduce the statement to one about the dimension of $\mathbf{k}^{k}$, where $\mathbf{k}$ denotes the ground field.
I argue the contrapositive. Assume that each $B_{1} \cup \{w_{i}\}$ is linearly dependent, i.e., there exist scalars $c_{i,0},c_{i,1},\dots,c_{i,k} \in \mathbf{k}$, at least one nonzero, such that $$ c_{i,0}w_{i} + \sum_{j = 1}^{k} c_{i,j} v_{j} = 0. $$ In fact, $c_{i,0}$ must be nonzero, for otherwise we would have a nontrivial linear dependence among the vectors of $B_{1}$. Assume without loss of generality that $c_{i,0} = 1$, so that $$ w_{i} = \sum_{j = 1}^{k} (-c_{i,j}) v_{j}. $$

In other words, for each $1 \leq i \leq k+1$, we have a vector $c_{i}$ in $\mathbf{k}^{k}$, whose $j$th entry is $c_{i,j}$, such that $$ w_{i} = -\begin{pmatrix} & & \\ v_{1} & \cdots & v_{k} \\ & & \end{pmatrix} c_{i}, $$ where the matrix with columns $v_{1},\dots,v_{k}$ acts formally: multiplying it by a coefficient vector produces the corresponding linear combination of the $v_{j}$.

Because $\mathbf{k}^{k}$ has dimension $k$, the $k+1$ vectors $c_{i}$ are linearly dependent. That is, there exist $\lambda_{1},\dots,\lambda_{k+1}$, not all zero, such that $$ \sum_{i=1}^{k+1} \lambda_{i}c_{i} = 0. $$ Multiply from the left by $$ -\begin{pmatrix} & & \\ v_{1} & \cdots & v_{k} \\ & & \end{pmatrix} $$ to get $$ \sum_{i=1}^{k+1} \lambda_{i}w_{i} = 0. $$
In conclusion, if the $w_{i}$ are linearly independent, it cannot be the case that each $B_{1} \cup \{w_{i}\}$ is linearly dependent; some $B_{1} \cup \{w_{i}\}$ must be linearly independent, which is exactly the claim.
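The statement also admits a direct computational check: search over $i$ for a $w_i$ that enlarges the rank. A minimal sketch with numpy, assuming the vectors live in some $\mathbf{R}^n$ and are stored as matrix columns (the function name is my own, not from the problem):

```python
import numpy as np

def exchange_index(V, W):
    """Search for an index i such that the columns of V together with
    W[:, i] remain linearly independent.  By the argument above, such
    an i must exist whenever the columns of V and the columns of W
    each form linearly independent sets (W has one more column)."""
    k = V.shape[1]
    for i in range(W.shape[1]):
        # Appending W[:, i] preserves independence iff the rank
        # increases to k + 1.
        if np.linalg.matrix_rank(np.column_stack([V, W[:, i]])) == k + 1:
            return i
    return None  # unreachable when the hypotheses hold
```

For instance, with $V$ the single column $e_1$ in $\mathbf{R}^3$ and $W$ the columns $e_1, e_2$, the search skips $w_1$ (which lies in the span of $v_1$) and returns the index of $e_2$.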
Let us examine a concrete example. Take $\mathbf{k} = \mathbf{R}$, $k = 2$, and $$ v_{1} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad v_{2} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, $$ $$ w_{1} = \begin{pmatrix} 1 \\ 4 \end{pmatrix}, \quad w_{2} = \begin{pmatrix} 2 \\ 5 \end{pmatrix}, \quad w_{3} = \begin{pmatrix} 3 \\ 6 \end{pmatrix}. $$ (Three vectors in $\mathbf{R}^{2}$ are necessarily dependent, so this example illustrates the contrapositive: the construction produces an explicit dependence among the $w_{i}$.) The first step in the proof is to express each $w_{i}$ as a linear combination of the $v_{j}$. Specifically, the vectors $c_{i}$ in the proof come out to be $$ c_{1} = \begin{pmatrix} 3 \\ -4 \end{pmatrix}, \quad c_{2} = \begin{pmatrix} 3 \\ -5 \end{pmatrix}, \quad c_{3} = \begin{pmatrix} 3 \\ -6 \end{pmatrix}. $$ Row reduction on the matrix $$ \begin{pmatrix} 3 & 3 & 3 \\ -4 & -5 & -6 \end{pmatrix} $$ yields $$ \begin{pmatrix} 1 & 0 & -1 \\ 0 & 1 & 2 \end{pmatrix}, $$ from which we read off a nonzero vector in the kernel, namely $$ \begin{pmatrix} 1 \\ -2 \\ 1 \end{pmatrix}. $$ In other words, we can take $\lambda_{1} = 1, \lambda_{2} = -2, \lambda_{3} = 1$. Indeed, $$ \begin{pmatrix} 1 \\ 4 \end{pmatrix} - 2 \begin{pmatrix} 2 \\ 5 \end{pmatrix} + \begin{pmatrix} 3 \\ 6 \end{pmatrix} = 0. $$
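The computation above can be replayed numerically. A short sketch with numpy, assuming the vectors are stored as the columns of matrices `V` and `W`:

```python
import numpy as np

# Columns are v1, v2 and w1, w2, w3 from the example above.
V = np.array([[1.0, 1.0],
              [0.0, 1.0]])
W = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# Step 1: the coefficient vectors of the proof satisfy w_i = -V c_i,
# so the columns c_i solve V C = -W.
C = -np.linalg.solve(V, W)   # columns (3, -4), (3, -5), (3, -6)

# Step 2: a nonzero kernel vector of C supplies the lambda_i.  The
# last right-singular vector spans the null space of this rank-2 matrix.
lam = np.linalg.svd(C)[2][-1]   # proportional to (1, -2, 1)

# The resulting dependence: sum_i lambda_i w_i = 0.
assert np.allclose(W @ lam, 0.0)
```

Solving for all three coefficient vectors at once via `np.linalg.solve(V, W)` works because `solve` accepts a matrix right-hand side and solves column by column.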