In my Mathematical Techniques course we've been talking about vector spaces, bases, etc.
There is one problem, however, that I cannot get my head around, and that is to prove that $\mathbb{C}$ can be represented as a $2$-dimensional vector space over the reals, $\mathbb{R}^2$. I'm fine with proving associativity, commutativity and so on, but I'm not sure whether the basis vectors should have one or two elements.
What are the basis vectors? $$\begin{bmatrix} 1 \end{bmatrix}, \begin{bmatrix} i \end{bmatrix} \quad\text{or}\quad \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix} \quad\text{or}\quad \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ i \end{bmatrix}$$ Such that: $$ a + bi \rightarrow a \begin{bmatrix} 1 \end{bmatrix} + b \begin{bmatrix} i \end{bmatrix} \quad\text{or}\quad \begin{bmatrix} a \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ b \end{bmatrix} \quad\text{or}\quad \begin{bmatrix} a \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ bi \end{bmatrix}$$
For some reason I can't get my head around this part of the problem. My lecturer was very vague when he covered this bit which has left me a little dubious.
Thanks, Sean.
In order to identify $\mathbb{C}$ as a $2$-dimensional vector space over $\mathbb{R}$ (note that we say "over $\mathbb{R}$", not "over $\mathbb{R}^2$": the phrase simply records which set supplies the scalars for scalar multiplication), we need to specify a basis (over $\mathbb{R}$), necessarily of two elements. We usually write elements of $\mathbb{C}$ as $$a + bi,$$ where $a$ and $b$ are (unique) real numbers; this implicitly singles out the basis $(1, i)$ of $\mathbb{C}$ over $\mathbb{R}$. On the other hand, we know that any $n$-dimensional vector space over $\mathbb{R}$ is isomorphic to $\mathbb{R}^n$, and we usually think of $\mathbb{R}^n$ as the set of column vectors with $n$ real entries.
We can connect this with our usual complex notation by writing down an explicit isomorphism. As usual, it is enough to specify what happens to basis elements. One natural choice for the isomorphism is the map $\Phi$ defined by the assignments $$1 \mapsto \begin{pmatrix}1 \\ 0\end{pmatrix}, \qquad i \mapsto \begin{pmatrix}0 \\ 1\end{pmatrix}.$$ By linearity (over $\mathbb{R}$), the general formula for $\Phi$ must then be $$\Phi: a + bi \mapsto \begin{pmatrix}a \\ b\end{pmatrix}.$$
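Indeed, the general formula follows directly from the basis assignments: writing $a + bi = a \cdot 1 + b \cdot i$ and using $\mathbb{R}$-linearity, $$\Phi(a + bi) = a\,\Phi(1) + b\,\Phi(i) = a\begin{pmatrix}1 \\ 0\end{pmatrix} + b\begin{pmatrix}0 \\ 1\end{pmatrix} = \begin{pmatrix}a \\ b\end{pmatrix}.$$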
Remark Note that we can also represent complex numbers $a + bi$ as matrices, via the map $$\Psi: a + bi \mapsto \begin{pmatrix} a & -b \\ b & a \end{pmatrix}.$$ The repeated entries may seem redundant, but this map has the additional benefit that $\Psi$ turns multiplication of complex numbers into multiplication of matrices. That is, $\Psi$ respects both the addition and the multiplication operations, identifying $\mathbb{C}$ with a subring of the real $2 \times 2$ matrices, and we call such structure-preserving maps ring isomorphisms (here, onto the image of $\Psi$).
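As a quick check that $\Psi$ respects multiplication, compare both sides on general elements. Since $(a + bi)(c + di) = (ac - bd) + (ad + bc)i$, we have $$\Psi\big((a+bi)(c+di)\big) = \begin{pmatrix} ac - bd & -(ad + bc) \\ ad + bc & ac - bd \end{pmatrix},$$ while multiplying the matrices directly gives $$\Psi(a+bi)\,\Psi(c+di) = \begin{pmatrix} a & -b \\ b & a \end{pmatrix}\begin{pmatrix} c & -d \\ d & c \end{pmatrix} = \begin{pmatrix} ac - bd & -ad - bc \\ bc + ad & -bd + ac \end{pmatrix},$$ and the two results agree entry by entry.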