Generalizing the fact that an $n$-dimensional vector space is isomorphic to the $n$-fold product of its field


This question is related to one asked by somebody else here; however, unlike them, I do not want to show that a vector space over a field $F$ and of dimension $n$ is isomorphic to the $n$-th direct power $F^{n}$ of $F$. Instead, I want to generalize this to the following case:

If $B$ is a basis for a vector space $V$ over $F$, I want to show that $V$ is isomorphic to the direct sum $\oplus_{\alpha \in B}F$ of copies of $F$ indexed by $B$.

To that effect, I am hoping to write down an explicit isomorphism from $V$ to $\oplus_{\alpha \in B}F$.

So, suppose $V$ is a vector space over $F$. Then, since every vector space has a basis, denote this basis by $B = \{ v_{i} \}_{i \in I}$, where $I$ is some indexing set.

Then, since every vector $w \in V$ can be uniquely represented as a (finite) linear combination of basis elements, we have $w = \sum_{i \in I}x_{i}v_{i}$, where all but finitely many of the $x_{i}$ are zero.

Now, this appears to give us a bijective map $\displaystyle g:V \to \oplus_{\alpha \in B}F$ defined by $\displaystyle w \mapsto \oplus_{i \in I}x_{i}$.

Is this the correct map and/or the correct way to denote it? It seems a little weird to me.

Also, I would need to show that such a map is a homomorphism. I'm not sure how to write this out: I tried letting $w, \overline{w} \in V$, where $\displaystyle \overline{w} = \sum_{i \in I} \overline{x}_{i}v_{i}$.

Then $g(w) = \oplus_{i\in I}x_{i}$ and $g(\overline{w}) = \oplus_{i \in I}\overline{x}_{i}$, but I'm not sure how to add these two direct sums together. In particular, does $g(w+\overline{w}) = \oplus_{i \in I} (x_{i} + \overline{x}_{i})$? Or does it equal $\oplus_{i \in I} x_{i}\overline{x}_{i}$? I know that the direct sum is the subset of the Cartesian product consisting of elements in which all but finitely many coordinates are zero, but I am getting confused about what each side of this homomorphism should look like - if I am even getting the homomorphism itself correct, that is.

I thank you in advance for your help and patience!


Best answer:

Think of $\bigoplus_{\alpha \in B}F$ as the set of finite-support functions $B\rightarrow F$, as AreaMan says in (1.). Every vector in $V$ can be expressed as a finite linear combination of elements of $B$: if $w \in V$, $w \neq 0$, then $w = \sum_{i = 1}^m a_{s_i}v_{s_i}$, where $v_{s_i} \in B$, $a_{s_i} \in F$, $a_{s_i} \neq 0$ (I use $s_i$ as indices; they are not necessarily the first $m$ vectors of $B$). Your function $g:V\rightarrow \bigoplus_{\alpha \in B}F$ then maps $w$ to the function $f_w$ such that: $$ f_w(x) = \begin{cases} a_{s_i} & \text{if } x = v_{s_i} \text{ for some $1\leq i\leq m$}\\ 0 & \text{otherwise} \end{cases} $$ ($f_w$ gives the coefficients of $w$ when expressed as a linear combination of $B$; also $0 \mapsto f_0$, where $f_0(x) = 0$ for every $x$). You need to check:

  1. $f_w \in \bigoplus_{\alpha \in B} F$ for every $w \in V$ (this happens since $B$ is a basis for $V$).
  2. $g$ is well defined and bijective.
  3. $g$ is a linear transformation.

In $\bigoplus_{\alpha \in B}F$, sum and scalar multiplication are defined pointwise, that is, if $s,t \in \bigoplus_{\alpha \in B}F$ and $\lambda \in F$, then $s+t$ and $\lambda s$ lie in $\bigoplus_{\alpha \in B}F$ and are defined as follows: $$(s+t)(x) = s(x)+t(x)$$ $$(\lambda s)(x) = \lambda s(x)$$ for every $x \in B$ (the operations on the right-hand side are those of $F$).
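A minimal Python sketch of this answer, with $F$ taken to be $\mathbb{Q}$ via `Fraction` and elements of $\bigoplus_{\alpha \in B}F$ modeled as dictionaries from basis labels to nonzero coefficients (the names `add`, `scale`, `w`, `wbar` are illustrative, not from the original):

```python
# Model elements of the direct sum ⊕_{b in B} F as dicts mapping basis labels
# to nonzero coefficients. Finite support is automatic: a dict stores only
# finitely many keys. The field F is Q here, via Fraction, for exactness.
from fractions import Fraction

def add(s, t):
    """Pointwise sum (s+t)(x) = s(x) + t(x); drop zero coefficients."""
    out = dict(s)
    for b, c in t.items():
        out[b] = out.get(b, Fraction(0)) + c
        if out[b] == 0:
            del out[b]
    return out

def scale(lam, s):
    """Pointwise scalar multiple (lam*s)(x) = lam * s(x)."""
    if lam == 0:
        return {}
    return {b: lam * c for b, c in s.items()}

# g sends a vector, given by its coordinates in the basis B, to the
# finite-support function f_w of its coefficients -- here the two are
# literally the same data, which is the content of the isomorphism.
w    = {"v1": Fraction(2), "v2": Fraction(3)}     # w    = 2*v1 + 3*v2
wbar = {"v1": Fraction(-2), "v3": Fraction(5)}    # wbar = -2*v1 + 5*v3

# Coefficients add index-wise: the i-th coefficient of w + wbar is
# x_i + xbar_i (not x_i * xbar_i), which answers the question above.
print(add(w, wbar))           # {'v2': Fraction(3, 1), 'v3': Fraction(5, 1)}
print(scale(Fraction(2), w))  # {'v1': Fraction(4, 1), 'v2': Fraction(6, 1)}
```

Note that the `v1` entry of the sum vanishes because $2 + (-2) = 0$, so it is dropped to keep the support finite and the representation canonical.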

The direct sum of vector spaces is a subspace of the direct product, that is: $$\bigoplus_{i \in I}V_i \leq \prod_{i \in I}V_i.$$ When $I$ is an infinite set, the direct sum is a proper subspace: for instance, with $V_i = F$ for all $i$, the element with every coordinate equal to $1$ lies in the product but not in the direct sum. Sum and scalar multiplication are defined pointwise, as above.

Another answer:

What you are doing seems more or less correct, but I notice some confusion about basic (= fundamental, not trivial) things that this list of exercises will hopefully help with. Please let me know what isn't clear.

  1. The direct sum $\oplus_{b \in B} F$ can be thought of as functions $\phi : B \to F$ with the property that for all but finitely many $b$, $\phi(b) = 0$. These are added and scaled coordinatewise, like any other vector space of functions.

1b. The direct sum can also be thought of as the space of formal linear combinations of the elements of $B$, i.e. the space whose vectors are formal expressions (strings) like $2b_1 + 17b_2$, or in general finite(!) sums $a_0 b_0 + \ldots + a_n b_n$ with $a_i \in F$ and $b_i \in B$. These expressions are subject to certain identifications that make the space into a vector space, such as $(b + b') + b'' = b + (b' + b'')$, $b + b' = b' + b$, and $2b + b = 3b$.

Exercise: Prove that these two descriptions produce isomorphic spaces by writing down a specific, natural isomorphism. You will have to work out a more precise definition of the second description. (If you want, skip this, and also the parenthetical comment in 3. I think this point of view is helpful, though, even if it is confusing at first; it also comes up a lot in algebra. You can google "free vector space on a set" to find better exposition of this.)
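Without doing the exercise, here is a small sketch of how identifications like $2b + b = 3b$ from (1b) can be enforced mechanically: a formal sum, stored as a hypothetical list of (coefficient, symbol) pairs, normalizes to a finite-support dictionary by combining like terms, and that normal form is exactly a function from description 1.

```python
# Sketch: a formal linear combination as a list of (coefficient, symbol)
# pairs, normalized by combining like terms into a finite-support dict.
# The identifications such as 2b + b = 3b then hold on the nose.
from fractions import Fraction

def normalize(terms):
    """Combine like terms; drop zero coefficients."""
    out = {}
    for coeff, sym in terms:
        out[sym] = out.get(sym, Fraction(0)) + coeff
        if out[sym] == 0:
            del out[sym]
    return out

# 2b + b = 3b:
print(normalize([(Fraction(2), "b"), (Fraction(1), "b")]))  # {'b': Fraction(3, 1)}

# b + b' = b' + b: dicts compare by content, not by insertion order.
assert normalize([(Fraction(1), "b"), (Fraction(1), "b'")]) == \
       normalize([(Fraction(1), "b'"), (Fraction(1), "b")])
```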

  2. We will use the first description of the direct sum. Given a family of linear maps $T_b : F \to V$ for each $b \in B$, we obtain a map $T: \oplus_{b \in B} F \to V$ by sending a function $\phi : B \to F$ to $\sum_{b \in B} T_b(\phi(b))$. Note that only finitely many $\phi(b)$ are nonzero, hence this is a finite sum. Check that this is linear in $\phi$. Next check that, if $i_b : F \to \oplus_{b \in B} F$ is the map that sends $c$ to the function $\phi$ defined by [$\phi(b) = c$ and $\phi(b') = 0$ if $b' \neq b$], then $T \circ i_b = T_b$. (To learn more about this, search for coproducts of vector spaces. Aluffi's Algebra Chapter Zero presumably treats this somewhere - a very lucid book, I highly recommend it.) (Exercise: Phrase this in terms of the second description of the direct sum.)
  3. For each element $b \in B$ of a basis for $V$, we have a natural map $T_b : F \to V$ that sends $c$ to $cb$. Let $T$ be the map constructed from this data as in step 2. (With the second perspective (1b) on the direct sum, $T$ is the map that takes a formal linear combination of basis vectors and evaluates it in $V$.)
  4. This is the inverse of the map you defined above. It is worthwhile to check that surjectivity corresponds to the condition that $B$ spans, and injectivity to the linear independence of $B$.
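The construction of $T$ from the family $T_b$ can be sketched in Python on a hypothetical toy instance: $B$ a three-element set of labels, $V = F^3$ represented as a list, $F = \mathbb{Q}$ via `Fraction`, and $T_b(c) = c\,e_b$ (the names `T_b`, `T`, `i_b` mirror the notation above but the concrete choices are illustrative):

```python
# Toy instance of the universal property: given maps T_b : F -> V, build
# T : ⊕_{b in B} F -> V with T(phi) = sum over b of T_b(phi(b)), and check
# that T ∘ i_b = T_b. Elements of the direct sum are finite-support dicts.
from fractions import Fraction

B = ["b0", "b1", "b2"]

def T_b(b):
    """The family of linear maps T_b : F -> V; here T_b(c) = c * e_b."""
    e = [Fraction(int(b == bb)) for bb in B]  # standard basis vector e_b
    return lambda c: [c * x for x in e]

def T(phi):
    """T(phi) = sum of T_b(phi(b)); finitely many terms, so the sum is finite."""
    total = [Fraction(0)] * len(B)
    for b, c in phi.items():
        total = [u + v for u, v in zip(total, T_b(b)(c))]
    return total

def i_b(b, c):
    """The inclusion i_b : F -> ⊕ F, sending c to the function supported at b."""
    return {} if c == 0 else {b: c}

# T ∘ i_b = T_b: both send c to c * e_b.
c = Fraction(7)
assert T(i_b("b1", c)) == T_b("b1")(c)

# T evaluates a coefficient function against the basis: 1*e_b0 + 2*e_b2.
print(T({"b0": Fraction(1), "b2": Fraction(2)}))
```

Here $T$ is visibly the map that "evaluates" a coefficient function in $V$, and its inverse is the coefficient-extraction map $g$ from the question.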

Follow up exercise: We didn't really need to be working with $F$ in all of this. Figure out more general versions of these statements.