Linear combination, span, independence and bases for infinite-dimensional vector spaces


I've only recently started studying linear algebra using some lecture notes by Evan Dummit (https://math.la.asu.edu/~dummit/docs/linalgprac_2_vector_spaces.pdf).

After defining vector spaces, the notions of linear combination, span, generating set and linear independence are introduced. All of this culminates in the definition of a basis for a vector space, followed by that of dimension.

Def: A vector $w$ is a linear combination of a set of vectors $v_{1}, v_{2},...,v_{n}$ if $\exists$ scalars $a_{1}, a_{2},..., a_{n}$ s.t. $w=a_{1}v_{1}+a_{2}v_{2}+\cdots+a_{n}v_{n}$. Even though it is not explicitly stated, this is a finite set of vectors, since otherwise the expression would not have any meaning.

Def: The span of a set of vectors $S=\{v_{1}, v_{2},...,v_{n}\}$ is the set of all linear combinations of $S$.

Def: Given a vector space $V$, we say that $S$ is a generating set for $V$ if $\operatorname{span}(S)=V$. This means that every vector in $V$ can be written as a linear combination of the vectors in the set $S$.

Def: A finite set of vectors $v_{1}, v_{2},...,v_{n}$ is linearly independent if $a_{1}v_{1}+a_{2}v_{2}+\cdots+a_{n}v_{n}=0$ implies that $a_{i}=0$ $\forall i$. An infinite set of vectors is linearly independent if every finite subset is linearly independent (this is again because a linear combination of infinitely many vectors does not make sense).
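
For example, in the space of all polynomials with real coefficients, the infinite set $\{1, x, x^2, x^3, \ldots\}$ is linearly independent: a relation $a_0\cdot 1 + a_1x + \cdots + a_nx^n = 0$ among any finite subset forces every $a_i$ to be $0$.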

Def: Given a vector space $V$, we say that an independent set of vectors which spans $V$ is a basis.

So far so good with the definitions, but there is one thing that I just couldn't understand. Given a basis, we can talk about the dimension of the vector space (the number of basis elements), and there are also infinite-dimensional vector spaces. However, there is also a theorem which states that every vector space (finite- or infinite-dimensional) has a basis.

So my question is: how can a basis even exist in the infinite-dimensional case, when the definition of a linear combination only makes sense for finitely many vectors and a basis in this case has an infinite number of elements by definition?

Can someone please point me in the correct direction? What am I missing?

Thanks very much!

3 Answers

Accepted Answer

As an example, take the space $V$ of all sequences $(a_n)_{n\in\mathbb N}$ of real numbers such that $a_n=0$ if $n$ is large enough. A basis of $V$ is the set $\{e_1,e_2,e_3,\ldots\}$, where $e_k$ is the sequence whose $k$th term is $1$ and whose other terms are all equal to $0$. This set is a basis of $V$ because if $(a_n)_{n\in\mathbb N}\in V$, then, for some $N\in\mathbb N$, $a_n=0$ whenever $n>N$, and
$$(a_n)_{n\in\mathbb N}=a_1e_1+a_2e_2+\cdots+a_Ne_N.$$
So, as you can see, even though $\dim V=\infty$, every element of $V$ is a linear combination of a finite number of elements of the set $\{e_1,e_2,e_3,\ldots\}$.
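
Here is a minimal Python sketch of that decomposition (representing an element of $V$ by the finite list of its terms up to the last nonzero one; the representation and the function name are just illustrative choices):

```python
# Represent an element of V by the finite list of its terms up to the last
# nonzero one; every later term is understood to be 0.
def basis_coefficients(seq):
    """Return the finitely many coefficients a_k (indexed from 1) with
    seq = a_1*e_1 + a_2*e_2 + ... + a_N*e_N."""
    # The coefficient of e_k is simply the k-th term of the sequence.
    return {k + 1: a for k, a in enumerate(seq) if a != 0}

# The sequence (3, 0, -2, 0, 0, ...) equals 3*e_1 - 2*e_3.
print(basis_coefficients([3, 0, -2]))  # -> {1: 3, 3: -2}
```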

Answer

A basis $\mathcal B$ can indeed have an infinite number of elements. However, the span $S$ of $\mathcal B$ is the set of vectors that can be written as finite linear combinations of elements of $\mathcal B$.

Those two facts are not incompatible. In particular, any element $v \in \mathcal B$ is an element of $S$, since $v = 1 \cdot v$, and so is the sum of any two elements of $\mathcal B$.

What is interesting, however, is that for a given vector space $V$, the cardinality of any basis of $V$ is the same. This is what makes it possible to speak of the dimension of a vector space.

Answer

All those definitions remain valid for infinite-dimensional spaces (spaces with an infinite basis). But they are not very useful in the infinite-dimensional spaces that mathematicians and physicists care about most.

Those spaces usually have enough structure to make sense of infinite sums. Here's one classic example.

Let $H$ be the set of all sequences $(a_n)$ of real (or complex) numbers such that the sum $\sum a_n^2$ converges (in the complex case, $\sum |a_n|^2$). It's clear that $H$ is closed under vector addition and scalar multiplication: those happen term by term. Then you can define the distance between any two vectors by analogy with the Euclidean distance:

$$ |v-w| = \sqrt{\sum_{n = 1}^\infty (v_n - w_n)^2} $$

With that definition you can make sense of some infinite sums of vectors, and use those infinite sums to define independence, span and basis. The set of vectors $e_i$, where for each $i$ the vector $e_i$ has a $1$ in place $i$ and is $0$ elsewhere, is a basis in this extended sense.
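
For a concrete feel for this, here is a small numerical sketch (the particular vector $v$ with $v_n = 2^{-n}$ and the finite cutoff approximating the infinite tail are my own illustrative choices): the distance from $v$ to its truncation $v_1e_1+\cdots+v_Ne_N$ tends to $0$ as $N$ grows, so $v$ is an infinite sum of the $e_i$ in this metric sense even though it is not a finite linear combination of them.

```python
import math

# v is the sequence v_n = (1/2)**n for n = 1, 2, 3, ...; it belongs to H because
# the sum of v_n**2 converges.  The distance from v to its N-term truncation is
# |v - (v_1*e_1 + ... + v_N*e_N)| = sqrt(sum over n > N of v_n**2).
def tail_distance(N, cutoff=1000):
    # Approximate the infinite tail by a long (but finite) sum.
    return math.sqrt(sum((0.5 ** n) ** 2 for n in range(N + 1, cutoff)))

for N in (1, 5, 10, 20):
    print(N, tail_distance(N))  # shrinks towards 0 as N grows
```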

If you think about replacing the sums in that example by integrals, you can build even more interesting and useful vector spaces. The study of Fourier series can be thought of as understanding that the set of functions $\{ \sin nx, \cos nx\}$ forms a basis for the space of (nice enough) periodic functions.
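
As a tiny numerical illustration of that last remark (the square wave and its standard sine series are an illustrative example, not something taken from the notes above): partial sums of $\sum_{k\ \mathrm{odd}} \frac{4}{\pi k}\sin(kx)$ approach the square wave that equals $1$ on $(0,\pi)$ and $-1$ on $(\pi,2\pi)$.

```python
import math

# Partial sums of the Fourier sine series of the square wave
# f(x) = 1 on (0, pi), -1 on (pi, 2*pi):
# f(x) ~ sum over odd k of 4/(pi*k) * sin(k*x).
def partial_sum(x, terms):
    return sum(4 / (math.pi * k) * math.sin(k * x)
               for k in range(1, 2 * terms, 2))

x = 1.0  # a point where f(x) = 1
for terms in (1, 5, 50, 500):
    print(terms, partial_sum(x, terms))  # approaches 1 as more terms are included
```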