I am preparing a lecture note for a primary course on Vector Spaces and I am developing basis and linearly independent sets. There I took the following path:
- Define the linear span $L(S)$ of a subset $S$ of a vector space;
- Both the sets $ S_1=\{(1,0),(0,1)\} $ and $ S_2=\{(x,x+1):x\in\mathbb{R}\} $ have the same span $\mathbb{R}^2$;
- We can delete infinitely many elements from $ S_2 $ and still retain the same span, but no element can be deleted from $ S_1 $ without changing the span;
- Define a basis by: $S$ is a basis if $S$ spans the space $V$ and no proper subset of $S$ spans $V$;
- Assume the existence of a basis for any vector space;
- Define a linearly independent set by: $S$ is linearly independent if $\forall \alpha\in S$, $\alpha\notin L(S\setminus\{\alpha\})$;
- Define a maximal linearly independent set by: $S$ is a maximal linearly independent set if $S$ is linearly independent and no proper superset $S'\supsetneq S$ is linearly independent;
- Show that $S$ is a basis iff $S$ is a maximal linearly independent set;
- Existence of a maximal linearly independent set is guaranteed by the assumed existence of a basis.
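As a concrete check of the claim about $S_2$ (a minimal numerical sketch using NumPy; the choice of $x=0$ and $x=1$ is just one convenient pair of elements):

```python
import numpy as np

# Two particular elements of S_2 = {(x, x+1) : x in R}: take x = 0 and x = 1.
v1 = np.array([0.0, 1.0])
v2 = np.array([1.0, 2.0])

# A nonzero determinant means v1 and v2 are linearly independent, hence they
# already span R^2 -- so every remaining element of S_2 is redundant.
M = np.column_stack([v1, v2])
print(np.linalg.det(M))   # -1.0, nonzero
```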
After this, I want to define the dimension of a vector space. For that, I need to show that every basis (equivalently, every maximal linearly independent set) in $V$ has the same cardinality. And I am stuck on proving this.
A few things to mention:
- This is a primary course on vector spaces, and the existence theorem for a basis (and Zorn's lemma) is not in their syllabus.
- I am eventually going to move to finite dimensional spaces, but only after defining dimension in the general set up.
I have not found this approach anywhere. So if you know about this approach discussed in any book, please mention it. Otherwise, please help me to establish that any two bases of a vector space have the same cardinality.
The approach that I've been taught, which I think is more conventional, is to first learn spans, then linear independence, then define a basis and show that all bases have the same cardinality (at least in the finite-dimensional case; I'm not so sure about infinite-dimensional cases).
To prove the statement that any two bases have the same cardinality, I'll make use of theorems on matrices and their row-reduced echelon forms.
Lemma 1: If $B=\{v_1, v_2, \dots, v_n\}$ is a basis for a vector space $V$, then any set of vectors in $V$ with more than $n$ elements is linearly dependent.
Proof
Let $S$ be a subset of $V$ with more than $n$ elements; in particular, let $\alpha_1, \alpha_2, \dots, \alpha_m$ be distinct vectors in $S$ with $m > n$. It suffices to find a nontrivial solution of
$c_1\alpha_1 + c_2\alpha_2 + \dots + c_m\alpha_m = 0$
Converting everything to coordinates of the basis, we get
$c_1[\alpha_1]_B + c_2[\alpha_2]_B + \dots + c_m[\alpha_m]_B = 0$
But $[\alpha_i]_B$ are vectors in $\mathbb F^n$ where $\mathbb F$ is the field of the vector space (you can take it to be $\mathbb R$ if you haven't learnt fields).
If we form the matrix $\left( [\alpha_1]_B \;\; [\alpha_2]_B \;\; \dots \;\;[\alpha_m]_B\right)$, then this is an $n\times m$ matrix with $n < m$, so its row-reduced echelon form will certainly have a non-pivot column, and thus there are nontrivial solutions for $c_1, c_2, \dots, c_m$. $\;\;\blacksquare$
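Not part of the proof, but the argument can be checked numerically (a sketch using NumPy; the three vectors below are an arbitrary choice of $m = 3$ vectors in $\mathbb{R}^2$):

```python
import numpy as np

# Coordinate vectors of three vectors in R^2 (m = 3 > n = 2), written
# as the columns of an n x m matrix, as in the proof of Lemma 1.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

n, m = A.shape
assert m > n

# rank(A) <= n < m, so the homogeneous system A c = 0 has a nontrivial
# solution; a right-singular vector beyond the first n rows of Vt lies
# in the null space of A.
_, s, Vt = np.linalg.svd(A)
c = Vt[-1]                     # null-space direction, a unit vector

print(np.allclose(A @ c, 0))   # True: the dependence relation holds
print(np.allclose(c, 0))       # False: the relation is nontrivial
```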
Lemma 2: If $B=\{v_1, v_2, \dots, v_n\}$ is a basis for a vector space $V$, then any set of vectors in $V$ with fewer than $n$ elements does not span $V$.
Proof
By following the same procedure as in the previous proof, you'll get an $n\times m$ matrix $A$ with $n > m$, so its row-reduced echelon form $R = EA$ (where $E$ is the invertible matrix recording the row operations) will have a zero row, say row $i$. Then for every $c$, the $i^{th}$ entry of $Rc$ is zero, so the system $Ac = b$ has no solution whenever $(Eb)_i \neq 0$; taking $b = E^{-1}e_i$, the vector in $V$ whose coordinate vector is $b$ is not in the span of the set. $\;\;\blacksquare$
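Again purely as a numerical illustration (a sketch using NumPy; the two columns below are an arbitrary choice of $m = 2$ vectors in $\mathbb{R}^3$):

```python
import numpy as np

# Coordinate vectors of two vectors in R^3 (m = 2 < n = 3), written
# as the columns of an n x m matrix, as in the proof of Lemma 2.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# rank(A) <= m < n, so the columns cannot span R^3. A concrete witness:
# any nonzero vector orthogonal to both columns lies outside the column
# space (here, the cross product of the two columns).
b = np.cross(A[:, 0], A[:, 1])

# Least squares finds the best approximation of b by the columns;
# a strictly positive residual certifies that b is not in their span.
c, residual, *_ = np.linalg.lstsq(A, b, rcond=None)
print(residual)   # strictly positive
```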
Combining the two lemmas: if $B'$ is any other basis of $V$, then $B'$ is linearly independent, so $|B'| \le n$ by Lemma 1, and $B'$ spans $V$, so $|B'| \ge n$ by Lemma 2. Hence any two bases (in the finite-dimensional case) have the same cardinality.