Question about proof from Friedberg's Linear Algebra


Theorem 1.7 Let $S$ be a linearly independent subset of a vector space $V$, and let $v$ be a vector in $V$ that is not in $S$. Then $S \cup \{v\}$ is linearly dependent if and only if $v\in$ span($S$).

The following is the first part of the proof:

Proof . If $S \cup \{v\}$ is linearly dependent, then there are vectors $u_1,u_2,...,u_n$ in $S \cup \{v\}$ such that $a_1u_1 + a_2u_2 + \cdots + a_nu_n = 0$ for some nonzero scalars $a_1,a_2, \cdots , a_n$. Because $S$ is linearly independent, one of the $u_{i}$'s, say $u_1$, equals $v$. Thus $a_1v + a_2u_2 + \cdots + a_nu_n = 0$

$v = a_{1}^{-1}(-a_2u_2 - \cdots -a_nu_n) = -(a_{1}^{-1}a_2)u_2 - \cdots -(a_{1}^{-1}a_n)u_n.$

Since $v$ is a linear combination of $u_2, \cdots , u_n$, which are in $S$, we have $v \in$ span(S).
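The displayed computation can be checked numerically. Below is a minimal NumPy sketch (not part of Friedberg's proof) using made-up vectors in $\mathbb{R}^3$: we pick a dependence relation $a_1v + a_2u_2 + a_3u_3 = 0$, solve it for $v$ exactly as the proof does, and confirm that $v$ lands in span$(S)$.

```python
import numpy as np

# Hypothetical example: S = {u2, u3} is linearly independent in R^3.
u2 = np.array([1.0, 0.0, 0.0])
u3 = np.array([0.0, 1.0, 0.0])

# A dependence relation a1*v + a2*u2 + a3*u3 = 0 with nonzero a1.
a1, a2, a3 = 2.0, 3.0, -5.0

# Solve the relation for v, as in the proof: v = a1^{-1}(-a2*u2 - a3*u3).
v = (1.0 / a1) * (-a2 * u2 - a3 * u3)

# v equals the stated linear combination of vectors from S, so v is in span(S)...
assert np.allclose(v, -(a2 / a1) * u2 - (a3 / a1) * u3)

# ...and the original relation really does vanish, so S ∪ {v} is dependent.
assert np.allclose(a1 * v + a2 * u2 + a3 * u3, np.zeros(3))
```

The division by $a_1$ is legitimate precisely because the proof arranges for $u_1 = v$ to carry a nonzero coefficient.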

I am confused by the statement, "because $S$ is linearly independent, one of the $u_{i}$'s, say $u_1$, equals $v$." Why exactly does it follow from $S$ being linearly independent that one of the $u_{i}$'s is equal to $v$?


2 Answers

Best Answer

The zero vector can be expressed as a non-trivial linear combination of vectors from $S\cup \{v\}$ (by definition, since this set is lin. dep.). This gives

\begin{equation}0=a_1u_1+a_2u_2+\dots +a_nu_n \end{equation} with not all $a_i$ zero. This can happen only if one of the $u_i$'s is $v$: if the $u_i$'s were all drawn from $S$, then the linear independence of $S$ would force the equation to have only the trivial solution (all $a_i$'s zero), contradicting the non-triviality of the relation.

Hence without loss of generality (relabelling the vectors if you prefer) we can set $u_1=v$.
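The step "only the trivial solution if considered for vectors from $S$ only" can be seen concretely. A short NumPy sketch with hypothetical vectors: stack the vectors of $S$ as columns of a matrix $A$; then $a_2u_2 + a_3u_3 = 0$ is the system $Aa = 0$, and full column rank means $a = 0$ is its only solution.

```python
import numpy as np

# Hypothetical linearly independent set S = {u2, u3} in R^3.
u2 = np.array([1.0, 2.0, 0.0])
u3 = np.array([0.0, 1.0, 1.0])

# Columns of A are the vectors of S; A @ a = 0 encodes a2*u2 + a3*u3 = 0.
A = np.column_stack([u2, u3])

# Full column rank <=> no nontrivial dependence relation among vectors of S.
assert np.linalg.matrix_rank(A) == A.shape[1]

# Solving A @ a = 0 (least squares) therefore returns only the trivial solution.
a, *_ = np.linalg.lstsq(A, np.zeros(3), rcond=None)
assert np.allclose(a, 0.0)
```

So any non-trivial relation over $S \cup \{v\}$ must involve $v$ with a nonzero coefficient.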


$$u_1,u_2,...,u_n \in S \cup \{v\}$$ and $$a_1u_1 + a_2u_2 + \cdots + a_nu_n = 0$$

along with the linear independence of $S$ imply that either all coefficients are $0$ or one of the $u_i$'s is $v$. Since the coefficients are not all $0$, one of the $u_i$'s is $v$.