Doubt about the definition of free vector spaces.


Introduction:

Suppose you wish to construct a set $F(X)$ of linear combinations of elements of a given set $X$:

$$V = v_{1}x_{1}+\cdots +v_{n}x_{n} \tag{1}$$

where $v_{i}\in \mathbb{K}$ and $x_{i} \in X$. The set $X$ may carry other algebraic structures, but as far as the element $V \in F(X)$ is concerned, those structures are not inherited by $F(X)$. With a vector addition $\boxplus_{\mathcal{F}(X)}$ and a scalar multiplication $\boxdot_{\mathcal{F}(X)}$, the set $F(X)$ becomes a vector space:

$$\mathcal{F}(X) = (F(X),(\mathbb{K},+_{\mathbb{K}},\cdot_{\mathbb{K}}),\boxplus_{\mathcal{F}(X)},\boxdot_{\mathcal{F}(X)})$$

More formally, the "nature" of the free vector space $\mathcal{F}(X)$ is the vector space consisting of functions that are non-zero at only finitely many points of $X$:

$$\mathcal{F}(X) = \{f:X \to \mathbb{K}\hspace{2mm}|\hspace{2mm} |\mathrm{supp}(f)|<\infty\} \tag{2}$$

where $\mathrm{supp}(f) = \{x \in X \hspace{1mm}|\hspace{1mm} f(x) \neq 0\}$ is the support of $f$. Consider the subset of delta functions $\delta(X) \subset \mathcal{F}(X)$:

$$\delta(X) = \{\delta_{a}:X \to \mathbb{K}\hspace{2mm}|\hspace{2mm} a \in X\} \tag{3}$$

with:

$$\delta_{a}(x) = \begin{cases}1, & x=a \\ 0, & x \neq a\end{cases}$$

we can prove that $\mathrm{Span}\{\delta_{a}\}_{a\in X} = \mathcal{F}(X)$ and that $\{\delta_{a}\}_{a\in X}$ is a linearly independent set of vectors.

Therefore, with $(2)$ and $(3)$, the naive entity $(1)$ gains a formal existence in $\mathcal{F}(X)$ as:

$$f(x) = \sum_{a \in X}f(a)\delta_{a}(x) \tag{4}$$

where $a,x \in X$ and the $\delta_{a}$ are the basis vectors.
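The identification $(4)$ can be made concrete in code. In the hypothetical sketch below (the names `delta`, `add`, and `scale` are mine, with $\mathbb{K}$ taken as the integers), an element of $\mathcal{F}(X)$ is stored as a Python dict holding only its non-zero coefficients, which is exactly a function $X \to \mathbb{K}$ of finite support:

```python
def delta(a):
    """Basis vector delta_a, as a finite-support function stored as a dict."""
    return {a: 1}

def add(f, g):
    """Pointwise addition; keys absent from a dict have coefficient 0."""
    out = dict(f)
    for x, c in g.items():
        out[x] = out.get(x, 0) + c
        if out[x] == 0:
            del out[x]  # drop zero entries so the dict is exactly supp(f + g)
    return out

def scale(c, f):
    """Scalar multiplication c * f, keeping only non-zero coefficients."""
    return {x: c * v for x, v in f.items() if c * v != 0}

# The "naive" combination 2*x1 + 3*x2 of equation (1), built from deltas:
v = add(scale(2, delta("x1")), scale(3, delta("x2")))
# v == {"x1": 2, "x2": 3}
```

Note that the dict can never hold infinitely many keys, so the finite-support condition of $(2)$ is built into the representation.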

My Question:

My question is quite simple: I simply do not understand why we define $(2)$ in that way. My problem isn't the $\mathrm{supp}$ technicality. I mean, why do "functions that are non-zero at only finitely many points of $X$" do the job? (Or: suppose you are the first mathematician who needs the free vector space structure. Why would you define it as $(2)$, i.e. as "functions that are non-zero at only finitely many points of $X$"?)

There are 3 answers below.

Answer 1:

We know that a map out of a vector space is determined by its action on a basis, so if we want a free vector space on a set $X$, that means we want to have a basis element for each element of $X$. Let's write $\delta_x$ for the basis element associated to $x \in X$.

What does it mean that the family $\{ \delta_x \}_{x \in X}$ forms a basis? It means that every vector in our vector space can be written as a (finite!) linear combination of the $\delta_x$. That is, every vector looks like

$$a_1 \delta_{x_1} + a_2 \delta_{x_2} + \cdots + a_n \delta_{x_n}$$

for some $x_1, \ldots, x_n \in X$ and some $a_1, \ldots, a_n \in K$.

Now, of course, this is too easy. As mathematicians it's our job to find slick presentations of objects at the risk of confusing new students. Indeed, depending on the mathematicians you talk to, confusing new students is actually a feature rather than a bug (I'm joking, but only a little bit).

Really what happens is we would like to write our free vector space in terms of another, more concrete, vector space so that it's easy to check that it really is a vector space. We need to put some complexity somewhere, and by making the definition a bit more opaque it can make the coming proofs much cleaner. This is great for the person writing the textbook (who knows why the opaque definition works, but doesn't want to check that addition is associative, etc.) but is less great for the person reading the textbook (who is still building intuition for these objects).

So how does the idea of "linear combinations of the $\delta_x$" evolve over time?

Well, notice we need precisely the information of a coefficient $a_x$ for each $\delta_x$. Then we can look at the vector $\sum_{x \in X} a_x \delta_x$. Of course, since we're only allowed finite linear combinations of the basis vectors, we need to know that each of these sums is finite. That is, that all but finitely many of the $a_x$ are $0$.

So we can actually get away with a function $a : X \to K$ such that $a_x = 0$ for all but finitely many $x$. Of course, once we have these functions it's easy to see that we don't need to write down the $\delta_x$ anymore. These functions themselves form a vector space, and the functions $\chi_x : X \to K$ with $\chi_x(y) = \begin{cases} 1 & y=x \\ 0 & y \neq x \end{cases}$ form a basis.

(As a quick exercise, do you see how the function $\chi_x$ corresponds to the vector $\delta_x$ in a natural way?)

So we see that the vector space can be thought of as functions $X \to K$ of finite support. Exactly the definition that was presented to you.
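To see this concretely, here is a small sketch (the names `chi` and `combo` are mine, with $K$ taken as the integers): a finite linear combination of the $\chi_x$, evaluated pointwise, recovers exactly its own coefficients.

```python
def chi(x):
    """The indicator function chi_x : X -> K, with chi_x(y) = 1 iff y == x."""
    return lambda y: 1 if y == x else 0

def combo(coeffs):
    """The finite linear combination sum_x coeffs[x] * chi_x, as a function on X."""
    return lambda y: sum(c * chi(x)(y) for x, c in coeffs.items())

f = combo({"a": 5, "b": -2})   # the vector 5*chi_a - 2*chi_b
# f("a") == 5, f("b") == -2, and f vanishes everywhere else
```

Evaluating `f` at any point outside `{"a", "b"}` returns `0`, so `f` has finite support by construction.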


I hope this helps ^_^

Answer 2:

I don't think we choose this particular representation. It comes naturally from the fact that every element $V\in F(X)$ can be represented in a unique way as a combination of the elements of $X$ \begin{equation} V = \sum_{x\in X} v_x x \end{equation} where only finitely many of the $v_x$ are non-zero.

It is natural to define the mapping that associates to $V$ and $x$ the coefficient of $V$ on the element $x$ \begin{align} &C: F(X)\times X \to K\\ &(V, x)\mapsto v_x \end{align} Now if one considers $f_V(x) = C(V, x)$, then $f_V$ is a function $X\to K$ with finite support and the mapping $V\mapsto f_V$ is a vector space isomorphism, thus we can identify $V$ and $f_V$.
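A minimal sketch of this identification (my own names and conventions: $V$ is stored as a dict of its non-zero coefficients, with $K$ the integers):

```python
def C(V, x):
    """The coefficient map C : F(X) x X -> K, reading off v_x (0 if absent)."""
    return V.get(x, 0)

V = {"x1": 4, "x2": -1}   # V = 4*x1 - x2

def f_V(x):
    """The finite-support function X -> K identified with V."""
    return C(V, x)

# f_V is non-zero only at "x1" and "x2", so it has finite support
```

The mapping $V \mapsto f_V$ is then a bijection between formal sums and finite-support functions, and it clearly respects addition and scalar multiplication.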

Answer 3:

There are two ways to define a free vector space over a set $X$, given an external composition $c\cdot x$ where $c\in\mathbb{F}$ and $x\in X$.

Way 1:-

Take the set $X$ and Let $\displaystyle F(X)=\{\sum_{\text{finite}}c_{x}x:x\in X\,,c_{x}\in\mathbb{F}\}$

Or equivalently

$$F(X)=\{\sum c_{x}x:x\in X,c_{x}\in\mathbb{F}\,\text{such that}\,\,c_{x}=0\,\text{for all but finitely many}\,x\in X\}$$

That is, $F(X)$ is the set of all possible finite linear combinations of elements of $X$.

Then you see that this set has all the required properties of a vector space.

For example:-

Let $v,v'\in F(X)$. Then $v=\sum c_{x}x$ such that $c_{x}=0$ for all but finitely many $x\in X$.

and $\displaystyle v'=\sum c'_{x}x$ such that $c'_{x}=0$ for all but finitely many $x\in X$.

Then $\displaystyle v+v'=\sum_{x}(c_{x}+c'_{x})x$ is in $F(X)$ as $c_{x}+c'_{x}=0$ for all but finitely many $x\in X$.

You can verify that $F(X)$ satisfies all properties of a vector space.

Way 2.

Let $Fun(X)$ denote the set of all functions from $X\to\mathbb{F}$ such that $f(x)=0$ for all but finitely many $x\in X$.

Then you can again easily verify that this set $Fun(X)$ forms a vector space under pointwise addition of functions and scalar multiplication $(cf)(x)=c\cdot f(x)$ for all $x\in X$, $c\in\mathbb{F}$.

Now both these spaces are isomorphic.

Define $\phi:Fun(X)\to F(X)$ as follows: suppose $f$ is non-zero exactly at $x_{1},x_{2},\ldots,x_{n}$ in $X$, where it takes the values $c_{1},c_{2},\ldots,c_{n}$.

Then $\phi(f)=\sum_{x}c_{x}x$, where $c_{x}=c_{i}$ for $x=x_{i}$ and $c_{x}=0$ otherwise.

Then this $\phi$ is an isomorphism.
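A sketch of $\phi$ in code, under my own conventions for this illustration: a member of $Fun(X)$ is given as a callable together with a finite set containing its support, and a formal sum in $F(X)$ is stored as a dict of non-zero coefficients.

```python
def phi(f, support):
    """phi(f) = sum of f(x)*x over the support of f, stored as {x: coefficient}."""
    return {x: f(x) for x in support if f(x) != 0}

# f is non-zero at x1 and x3 only
f = lambda x: {"x1": 2, "x3": 7}.get(x, 0)
# phi sends f to the formal sum 2*x1 + 7*x3
```

Points where `f` vanishes are simply dropped, which is why only the finitely many non-zero values of `f` ever appear in the formal sum.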

Now, to answer your question of why it is defined like that, i.e. why the functions have to take non-zero values at only finitely many points: the simple reason is that you want something similar to the linear span of a set, and the span is precisely defined as the set of all possible finite linear combinations of elements of the set.

The finiteness assumption is vital because we have not given any sense to infinite sums; keep in mind that there is no notion of convergence here. The functions are defined in such a way that you are actually identifying them with the elements of the free span of $X$. Hence you do need finite support if you want these two spaces to be isomorphic.