Showing that a given set is an $\Bbb R$-basis of $\Bbb C \times \Bbb C$


Let $e_1=(1,0)$ and $e_2=(0,1)$ be the unit vectors of $\mathbb C^2$.

Show that $B=\{e_1,e_2,ie_1,ie_2\}$ is an $\mathbb R$-basis of $\mathbb C^2$.

Now, suppose we want to use the determinant of the matrix formed by the basis vectors to show that they are linearly independent. What I'd do is:

$\begin{pmatrix}1&0&i&0\\0&1&0&i\end{pmatrix}$

But this matrix isn't square, so it has no determinant. We'd need something like $\begin{pmatrix}1&0&0&0\\0&1&0&0\\0&0&i&0\\0&0&0&i\end{pmatrix}$

But I can't see how we would get this. I do see that if $B$ is an $\mathbb R$-basis, then $\dim_{\mathbb R}(\mathbb C^2)=4$, since $i\not\in\mathbb R$, but I can't see how we'd argue to actually get a $4 \times 4$ matrix.

What's the proper matrix and how do I get it?
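One standard way to make the determinant approach work is to identify $\mathbb C^2$ with $\mathbb R^4$ by splitting each complex coordinate into its real and imaginary parts. A minimal numerical sketch of this identification (the helper name `real_coords` is my own, not from the question):

```python
import numpy as np

# Identify C^2 with R^4 via (z1, z2) -> (Re z1, Im z1, Re z2, Im z2).
def real_coords(v):
    z1, z2 = v
    return [z1.real, z1.imag, z2.real, z2.imag]

e1 = (1 + 0j, 0 + 0j)
e2 = (0 + 0j, 1 + 0j)
i_e1 = (1j, 0 + 0j)   # i * e1
i_e2 = (0 + 0j, 1j)   # i * e2

# Columns of M are the R^4 coordinates of e1, e2, i*e1, i*e2.
M = np.array([real_coords(v) for v in (e1, e2, i_e1, i_e2)], dtype=float).T
det = np.linalg.det(M)  # nonzero (here -1), so the four vectors are R-independent
```

Since the four columns are $\mathbb R$-linearly independent and $\dim_{\mathbb R}(\mathbb C^2)=4$, they form an $\mathbb R$-basis; reordering the columns as $e_1, ie_1, e_2, ie_2$ would give the identity matrix.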


There are 3 best solutions below


It makes sense to write down a matrix only if you already have a basis, and you want to compare your would-be basis to your known basis.

Is it the case here? I don't think so. You don't have a basis of $\Bbb C^2$ to refer to.

Therefore the only way to go is to prove directly that the given family of vectors is linearly independent and spans $\Bbb C^2$ over $\Bbb R$.
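Both properties can be checked at once numerically: under the standard identification of $\mathbb C^2$ with $\mathbb R^4$, a unique solution of the coordinate system for every target vector shows the family spans and is independent. A sketch, with an arbitrarily chosen target vector:

```python
import numpy as np

# Columns: (Re z1, Im z1, Re z2, Im z2) coordinates of e1, e2, i*e1, i*e2.
M = np.array([
    [1, 0, 0, 0],
    [0, 0, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
], dtype=float)

# An arbitrary vector (z1, z2) in C^2, chosen here just as an example.
z1, z2 = 2 - 1j, -3 + 5j
target = np.array([z1.real, z1.imag, z2.real, z2.imag])

# M is invertible, so this solution exists and is unique for every target:
# the family spans and is linearly independent over R.
coeffs = np.linalg.solve(M, target)
```

Here `coeffs` recovers the real coefficients of the target with respect to $e_1, e_2, ie_1, ie_2$, in that order.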


The determinant-of-a-matrix method works only when you have the coordinates of one set of vectors with respect to another, already known basis. Checking whether vectors are linearly independent using their coordinates with respect to themselves tells you nothing (you'll always get the identity matrix).

All this is much simpler, though. You know that any vector $x\in\mathbb C^2$ decomposes uniquely as $x=x_1e_1+x_2e_2$, where $x_1,x_2 \in \mathbb C$.

Now, any $c \in \mathbb C$ decomposes uniquely as $c=a+bi$, with $a,b \in \mathbb R$.

Combining these observations, we get a unique decomposition of $x$:

$$x_1 = a_1+b_1 i$$

$$x_2 = a_2+b_2 i$$

$$x = (a_1+b_1i)e_1+(a_2+b_2i)e_2 = a_1 e_1 + b_1 (i e_1) + a_2 e_2 + b_2 (i e_2)$$

The unique decomposition part means that $\{e_1, ie_1, e_2, ie_2\}$ is a basis.
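The decomposition above amounts to reading off real and imaginary parts. A minimal sketch, with the complex coordinates $x_1, x_2$ chosen arbitrarily:

```python
# Take an arbitrary x = x1*e1 + x2*e2 with complex coordinates x1, x2.
x1, x2 = 3.0 - 2.0j, 0.5 + 4.0j

# Real coordinates with respect to {e1, i*e1, e2, i*e2}:
a1, b1 = x1.real, x1.imag   # x1 = a1 + b1*i
a2, b2 = x2.real, x2.imag   # x2 = a2 + b2*i

# Reconstruct x = a1*e1 + b1*(i e1) + a2*e2 + b2*(i e2), coordinate-wise:
rec1 = a1 + b1 * 1j
rec2 = a2 + b2 * 1j
```

Since real and imaginary parts of a complex number are unique, the four real coordinates are unique, which is exactly the basis property.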


This is essentially using the isomorphism $\mathbb C ^n \cong \mathbb R ^n \otimes \mathbb C$, and there is a well-known way to get a basis for $A\otimes B$ from bases of $A$ and $B$.


To say that $e_1=(1,0)$ and $e_2=(0,1)$ are the unit vectors of $\mathbb C^2$ means that this pair is a basis for the vector space $\mathbb C^2$ over $\mathbb C$. So, any element of $\mathbb C^2$ can be represented as: $$ (a+ib)(1,0)+(c+id)(0,1)=(a+ib,c+id)=a(1,0)+b(i,0)+c(0,1)+d(0,i) $$

so, since $a,b,c,d \in \mathbb R$ are uniquely determined, the elements $\{(1,0),(0,1),(i,0),(0,i)\}$ form a basis for $\mathbb C^2$ over $\mathbb R$.


This proves that the set $E=\{e_1, e_2, e_3=ie_1, e_4=i e_2 \}$ is a basis of $\mathbb C^2$ over $\mathbb R$ (note that this is true because we define the addition of elements in this space as the same addition in $\mathbb C^2$ over $\mathbb C$), so we can use these vectors to find the components of any vector with respect to this basis, and these components are four real numbers. Obviously the vectors of $E$, in this basis, have components $e_1=(1,0,0,0)^T$, $e_2=(0,1,0,0)^T$, $e_3=(0,0,1,0)^T$ and $e_4=(0,0,0,1)^T$, and the matrix that has these components as columns is the identity matrix, whose determinant is $1$.