The proof of: If $V$ is generated by a finite set $S$, then some subset of $S$ is a basis for $V$.


In my current understanding, a vector space should be provided before talking about the span of one of its subsets. (Correct me if this is wrong...)

I'm reading the proof below in my book, but I'm not sure whether it goes like this:

I first assume that "$V$ is generated by $S$"; since $V$ is given, $S$ is a subset of $V$. Then, since $\beta=\{u_1,u_2,\dots,u_k\}$ is picked out from $S$, it is also a subset of $V$. So I can say $\textrm{span}(\beta)\subseteq V.$ Combining this with the last part of the proof gives $V=\textrm{span}(\beta).$

Is this thinking process correct?


There are 3 answers below.

BEST ANSWER

The key fact is that if $\beta=\{u_1,\dots,u_k\}$ is linearly independent and $v\notin\operatorname{span}(\beta)$, then $$ \beta'=\{u_1,\dots,u_k,v\} $$ is also linearly independent (proof below).

Start from $\{u_1\}$ for some nonzero $u_1\in S$ and keep adding vectors of $S$ that are not in the span of those already chosen. The procedure stops when every remaining vector of $S$ lies in the span of the chosen ones; the chosen set is $\beta$.

Since all vectors in $S$ are now in the span of $\beta$, you conclude that $S\subseteq\operatorname{span}(\beta)$ and so $\operatorname{span}(S)\subseteq\operatorname{span}(\beta)$. On the other hand, from $\beta\subseteq S$, you can conclude that $\operatorname{span}(\beta)\subseteq\operatorname{span}(S)$.

Hence the two spans are the same.


Proof of the claim. Suppose $a_1u_1+a_2u_2+\dots+a_ku_k+bv=0$. If $b\ne0$, then $$ v=(-b^{-1})a_1u_1+(-b^{-1})a_2u_2+\dots+(-b^{-1})a_ku_k, $$ contradicting $v\notin\operatorname{span}(\beta)$. Therefore $b=0$, and then $a_1u_1+a_2u_2+\dots+a_ku_k=0$; by linear independence of $\beta$, also $a_1=a_2=\dots=a_k=0$.
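This bottom-up procedure can be sketched numerically. The following is a minimal illustration, assuming real coordinate vectors; the name `greedy_basis` is my own, and the "is $v$ outside the span?" test is done with numpy's floating-point `matrix_rank` rather than exact arithmetic.

```python
import numpy as np

def greedy_basis(vectors):
    """Scan the generating set; keep a vector only if it is not already
    in the span of the vectors kept so far (i.e. adding it raises the rank)."""
    basis = []
    for v in vectors:
        candidate = np.array(basis + [v])
        # rank == number of rows means the candidate rows are independent
        if np.linalg.matrix_rank(candidate) == len(candidate):
            basis.append(v)
    return basis

# The third vector is the sum of the first two, so it is skipped.
S = [np.array([1., 0., 0.]),
     np.array([0., 1., 0.]),
     np.array([1., 1., 0.]),
     np.array([0., 0., 1.])]
beta = greedy_basis(S)
```

Here `beta` ends up with three independent vectors whose span equals the span of `S`, matching the claim in the proof.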


The core of the proof is the following fact about vector spaces.

Let $S$ be a finite subset of a vector space $V$. There exists $\beta \subseteq S$ such that $\beta$ is linearly independent and $span(\beta) = span(S)$.

This implies your theorem immediately, since $V$ generated by $S$ means precisely $V = span(S) = span(\beta)$, therefore $\beta$ is a linearly independent set that spans $V$, i.e. is a basis for $V$.

As for proving this fact, constructing $\beta$ is easy. The proof you show gives a 'bottom-up' approach, but personally I think it's even clearer to go 'top-down.'

Suppose $S = \{s_1, \ldots, s_n \}$. If $S$ is a linearly independent set, then we are done already, taking $\beta = S$. Otherwise $S$ is linearly dependent, so there is a linear dependence $\sum_{i=1}^n c_i s_i = 0$ where some of the $c_i$ are nonzero. Assume WLOG $c_n \ne 0$. Set $S' = \{s_1, \ldots, s_{n-1} \}$. Then $span(S) = span(S')$, since $s_n = -c_n^{-1}\sum_{i=1}^{n-1} c_i s_i \in span(S')$. We can continue this process of removing elements without changing the span, and it must terminate in a linearly independent subset of $S$: the set shrinks at each step, and the empty set is vacuously linearly independent.
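The top-down process above can also be sketched in code. This is a rough numerical version, assuming real coordinate vectors; `prune_to_basis` is a made-up name, and linear dependence is detected via numpy's `matrix_rank` instead of solving for the $c_i$ explicitly.

```python
import numpy as np

def prune_to_basis(vectors):
    """Repeatedly drop a vector involved in a linear dependence
    (one whose removal leaves the span, i.e. the rank, unchanged)
    until the remaining set is linearly independent."""
    vs = list(vectors)
    while vs and np.linalg.matrix_rank(np.array(vs)) < len(vs):
        r = np.linalg.matrix_rank(np.array(vs))
        for i in range(len(vs)):
            rest = vs[:i] + vs[i+1:]
            # removing a dependent vector does not lower the rank
            if np.linalg.matrix_rank(np.array(rest)) == r:
                vs = rest
                break
    return vs

S = [np.array([1., 0.]), np.array([2., 0.]), np.array([0., 1.])]
beta = prune_to_basis(S)
```

Starting from three dependent vectors in the plane, the loop discards one of the two parallel vectors and stops with an independent pair spanning the same space.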


The source of confusion, at least for me, was not carefully understanding Theorem 1.5 of the same textbook. Here's an articulated breakdown of the same proof.

Theorem 1.5: "The span of any subset S of a vector space V is a subspace of V. Moreover, any subspace of V that contains S must also contain the span of S."

Explanation of 1.5:

  • First part: for any subset S of V, span(S) is a subspace of V.
  • Second part [any subspace of V that contains S must also contain the span of S]: if the subspace span(S) contains a set Q, then it contains span(Q).

Proof of theorem 1.9:

  • Construct β by adding vectors from S incrementally such that β remains linearly independent.
  • By Thm 1.5, since β ⊆ S and S is a subset of the vector space V [so span(S) is a subspace], span(β) ⊆ span(S).
  • Let v ∈ S. If v ∈ β, then clearly v ∈ span(β). If v ∉ β, then β ∪ {v} is linearly dependent, since we constructed β to be a maximal linearly independent subset of S. By Thm 1.7, v ∈ span(β), and since v was arbitrary, S ⊆ span(β). Applying Thm 1.5 again yields span(S) ⊆ span(β).
  • Since span(β) ⊆ span(S) and span(S) ⊆ span(β), span(β) = span(S).
  • We have span(S) = V, therefore span(β) = V.
  • Therefore β is a basis for V.
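The membership test "v ∈ span(β)" used in the third bullet can also be checked concretely. Below is a small sketch, assuming real coordinate vectors; `in_span` is a hypothetical helper name, and membership is tested by solving a least-squares system and checking the residual, so it is a floating-point approximation rather than an exact algebraic test.

```python
import numpy as np

def in_span(v, vectors, tol=1e-9):
    """Check v ∈ span(vectors): solve A x = v with the given vectors as
    columns of A, then test whether A x reproduces v within tolerance."""
    if not vectors:
        return bool(np.allclose(v, 0.0, atol=tol))  # span(∅) = {0}
    A = np.column_stack(vectors)
    x, *_ = np.linalg.lstsq(A, v, rcond=None)
    return bool(np.allclose(A @ x, v, atol=tol))

beta = [np.array([1., 0., 0.]), np.array([0., 1., 0.])]
# (3, -2, 0) is 3*u1 - 2*u2, so it lies in span(beta); (0, 0, 1) does not
```

Running `in_span` over every v ∈ S and getting `True` each time is exactly the verification that S ⊆ span(β).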