How to prove the statement in the case of finite- and infinite-dimensional vector spaces?


I want to prove that:

A chain of linearly independent sets has an upper bound.

Could someone tell me the difference between the finite and infinite cases please?

BEST ANSWER

I think the comments are correct, but are not really relevant. In fact,

The proof is exactly the same in the finite- and infinite-dimensional cases; there is no difference between the two.

I consider a "chain" to be indexed by a set, and there is no requirement that distinct indices yield distinct sets. So I could index the same set with infinitely many distinct indices, and that would still be an "infinite chain", even though it contains only finitely many distinct sets. But this is really irrelevant to the issue here.

The key observation is that linear dependence/independence really only depends on what happens on finite subsets. A set $S$ of vectors is linearly independent if and only if every finite subset of $S$ is linearly independent. And this holds because a "linear combination" can only involve finitely many terms in the sum.
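To make this observation concrete over, say, $\mathbb{R}^n$: a finite set of vectors is linearly independent exactly when the matrix having them as rows has rank equal to the number of vectors. A minimal numerical sketch (the function name and example vectors are my own, not part of the proof):

```python
import numpy as np

def is_linearly_independent(vectors):
    """Finitely many real vectors are linearly independent iff the
    matrix having them as rows has rank equal to their number."""
    A = np.array(vectors, dtype=float)
    return bool(np.linalg.matrix_rank(A) == len(vectors))

# e1, e2 are independent; adding e1 + e2 introduces a dependence.
print(is_linearly_independent([[1, 0], [0, 1]]))          # True
print(is_linearly_independent([[1, 0], [0, 1], [1, 1]]))  # False
```

This only ever inspects finitely many vectors at a time, which is precisely why independence of an infinite set reduces to independence of its finite subsets.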

So: let $V$ be a vector space, let $I$ be an index set, and for each $i\in I$, let $C_i\subseteq V$ be a linearly independent subset of $V$ such that $\{C_i\}_{i\in I}$ forms a chain. That is, for every $i,j\in I$, either $C_i\subseteq C_j$, or $C_j\subseteq C_i$. Note that this implies that given any finite collection of elements of the family, $C_{i_1},\ldots,C_{i_m}$, there exists one that contains all the others (you can prove that by induction on $m$).
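The "finite subcollection has a maximum" fact mirrors the inductive proof directly: scan the sets one at a time, keeping track of the largest seen so far. A sketch with Python sets ordered by inclusion (the function name is my own; it assumes, as in the statement, that the input really is a chain):

```python
def max_of_finite_subchain(sets):
    """Given finitely many sets forming a chain under inclusion,
    return one that contains all the others (induction on m).
    Raises ValueError if two of the sets are incomparable."""
    biggest = sets[0]
    for s in sets[1:]:
        if biggest <= s:          # biggest ⊆ s: s is the new maximum
            biggest = s
        elif not (s <= biggest):  # neither containment holds: not a chain
            raise ValueError("input sets do not form a chain")
    return biggest

chain = [{1}, {1, 2, 3}, {1, 2}]
print(max_of_finite_subchain(chain))  # {1, 2, 3}
```

Each loop iteration is exactly the inductive step: given a maximum of the first $m$ sets, comparing it with the $(m+1)$-st yields a maximum of all $m+1$.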

We want to show that the chain is bounded above; that is, that there exists a linearly independent subset $S$ of $V$ such that $C_i\subseteq S$ for every $i\in I$.

I claim that $$C = \bigcup_{i\in I}C_i$$ is a linearly independent subset of $V$; and this is an "upper bound" for the chain.

The fact that $C_i\subseteq C$ for each $i\in I$ is of course immediate from the definition of $C$. The real "meat" here is proving that $C$ is linearly independent.

To that end, let $v_1,\ldots,v_n\in C$, and let $\alpha_1,\ldots,\alpha_n$ be scalars such that $$\alpha_1v_1+\cdots+\alpha_nv_n = \mathbf{0}.$$ We need to show that $\alpha_1=\cdots=\alpha_n=0$. Because of the definition of $C$, for each $i=1,\ldots,n$, there is an index $k_i$ such that $v_i\in C_{k_i}$.

Because $\{C_i\}_{i\in I}$ is a chain, the finite subcollection $C_{k_1},\ldots,C_{k_n}$ has, by the observation above, a maximum: there is a $j$, $1\leq j\leq n$, such that $C_{k_i}\subseteq C_{k_j}$ for all $i$.

That means that in fact we have a single index $k_j$ such that $v_1,\ldots,v_n\in C_{k_j}$. And we also know that $C_{k_j}$ is linearly independent. That means that because we have a linear combination of vectors of $C_{k_j}$ that is equal to $\mathbf{0}$, it follows that every coefficient of $$\alpha_1v_1+\cdots+\alpha_nv_n = \mathbf{0}$$ must equal $0$. That is, $\alpha_1=\cdots=\alpha_n=0$.

But this is what we wanted to prove. And this shows that $C$ is indeed linearly independent, as claimed.
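As a numerical sanity check of the conclusion (not a substitute for the proof), here is a small chain $C_1\subseteq C_2\subseteq C_3$ of linearly independent sets in $\mathbb{R}^3$ whose union is again linearly independent; the specific vectors are my own illustrative choice:

```python
import numpy as np

# A chain C1 ⊆ C2 ⊆ C3 of linearly independent sets in R^3.
e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)
C1 = {e1}
C2 = {e1, e2}
C3 = {e1, e2, e3}

union = C1 | C2 | C3
A = np.array(sorted(union), dtype=float)
# The union is linearly independent: rank equals the number of vectors.
print(np.linalg.matrix_rank(A) == len(union))  # True
```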

Note that we do not need to consider the cases of $V$ being finite dimensional or infinite dimensional separately. The argument holds regardless of the dimension of $V$.