Proving whether a subset of a group is a subgroup


I have to check whether the following subset of a group is also a subgroup:

$$U = \left\{ \begin{pmatrix} a & -b \\ \overline{b} & \overline{a} \end{pmatrix} \in GL(2, \mathbb{C}) \bigg\vert |a|^2 + |b|^2 = 1, \ a,b \in \mathbb{C} \right\} \subset (GL(2,\mathbb{C}), \cdot)$$

I did the following:
1) $\forall x,y \in U: x\cdot y \in U$
$$\begin{pmatrix} a & -b \\ \overline{b} & \overline{a} \end{pmatrix} \cdot \begin{pmatrix} a' & -b' \\ \overline{b'} & \overline{a'} \end{pmatrix} = \begin{pmatrix} aa'-b\ \overline{b'} & -ab'-b\ \overline{a'} \\ \overline{b}a'+\ \overline{ab'} & -\overline{b}b' +\overline{a'a} \end{pmatrix}$$

where $|a|^2 + |b|^2 = 1$ and $|a'|^2 + |b'|^2 = 1$. To be honest, I am unable to check whether this product is in $U$. How can this be done?
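(As a numerical sanity check, not a proof, one can multiply random matrices of this form and verify that the product still has the required structure. This is a sketch using NumPy; the helpers `random_u` and `in_u` are ad-hoc names introduced here for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)

def random_u():
    # Draw a random matrix of the stated form with |a|^2 + |b|^2 = 1.
    v = rng.normal(size=4)
    v /= np.linalg.norm(v)          # normalise so |a|^2 + |b|^2 = 1
    a = v[0] + 1j * v[1]
    b = v[2] + 1j * v[3]
    return np.array([[a, -b], [np.conj(b), np.conj(a)]])

def in_u(m, tol=1e-12):
    # Check the structural constraints:
    # m = [[a, -b], [conj(b), conj(a)]] with |a|^2 + |b|^2 = 1.
    a, minus_b = m[0]
    return (abs(m[1, 0] - np.conj(-minus_b)) < tol
            and abs(m[1, 1] - np.conj(a)) < tol
            and abs(abs(a) ** 2 + abs(minus_b) ** 2 - 1) < tol)

for _ in range(100):
    x, y = random_u(), random_u()
    assert in_u(x @ y)              # closure under multiplication holds
```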

2) $e\in U$
Let $a = 1$ and $b = 0$; then we have $\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = e$ and $|1|^2 + |0|^2 = 1 \Rightarrow e \in U$.

3) $\forall x \in U: x^{-1} \in U$
I tried to compute the inverse matrix by row reduction, but I am stuck:

$$\left( \begin{array}{cc|cc} a & -b & 1 & 0 \\ \overline{b} & \overline{a} & 0 & 1 \end{array}\right)$$ $$a,\ \overline{a}\ne0,\ I \cdot \frac{1}{a}; II \cdot \frac{1}{\overline{a}} $$ $$\left( \begin{array}{cc|cc} 1 & -\frac{b}{a} & \frac{1}{a} & 0 \\ \frac{\overline{b}}{\overline{a}} & 1 & 0 & \frac{1}{\overline{a}} \end{array}\right)$$

And now, whatever I add, multiply, or subtract, I am not able to get the identity matrix on the left side. Is there anything I have overlooked? Because since

$$\det \begin{pmatrix} a & -b \\ \overline{b} & \overline{a} \end{pmatrix} = a\overline{a} + b\overline{b} = |a|^2 + |b|^2 = 1 \ne 0,$$

every matrix in $U$ should be invertible, right?

Thank you very much for your help!
FunkyPeanut


There is 1 best solution below.

On BEST ANSWER

For part (1): $$\begin{pmatrix} a & -b \\ \overline{b} & \overline{a} \end{pmatrix} \cdot \begin{pmatrix} a' & -b' \\ \overline{b'} & \overline{a'} \end{pmatrix} = \begin{pmatrix} aa'-b\ \overline{b'} & -ab'-b\ \overline{a'} \\ \overline{b}a'+\ \overline{ab'} & -\overline{b}b' +\overline{a'a} \end{pmatrix}$$

You want here that $$ (aa'-b\ \overline{b'})(\overline{aa'-b\ \overline{b'}}) + (ab'+b\ \overline{a'})(\overline{ab'+b\ \overline{a'}}) =1 $$ but this is not hard to check. You "just" have to write it all out.
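Writing it out explicitly (with the shorthand $A = aa' - b\overline{b'}$ and $B = ab' + b\overline{a'}$ introduced here only for readability), the cross terms cancel and the sum factors:

$$
\begin{aligned}
|A|^2 + |B|^2
  &= (aa' - b\overline{b'})(\overline{a}\,\overline{a'} - \overline{b}b')
   + (ab' + b\overline{a'})(\overline{a}\,\overline{b'} + \overline{b}a') \\
  &= |a|^2|a'|^2 - aa'\overline{b}b' - \overline{a}\,\overline{a'}b\overline{b'} + |b|^2|b'|^2 \\
  &\quad + |a|^2|b'|^2 + aa'\overline{b}b' + \overline{a}\,\overline{a'}b\overline{b'} + |b|^2|a'|^2 \\
  &= \left(|a|^2 + |b|^2\right)\left(|a'|^2 + |b'|^2\right) = 1 \cdot 1 = 1.
\end{aligned}
$$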

For part (3), note that the determinant of any matrix in your subset is $1$ (as you have shown). Recall that the inverse of a matrix in $SL(2,\mathbb{C})$ $$ \begin{pmatrix} a & b \\ c & d \end{pmatrix} $$ is $$ \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}. $$ That means the inverse of a matrix $$ \begin{pmatrix} a & -b \\ \overline{b} & \overline{a} \end{pmatrix} $$ is $$ \begin{pmatrix} \overline{a} & b \\ -\overline{b} & a \end{pmatrix}. $$ But this is clearly again in the subset.
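(A quick numerical check of this inverse formula, using the sample values $a = 0.6$, $b = 0.8i$, chosen here only so that $|a|^2 + |b|^2 = 1$:)

```python
import numpy as np

# Example entries with |a|^2 + |b|^2 = 1 (hypothetical values for illustration).
a, b = 0.6 + 0.0j, 0.0 + 0.8j

x = np.array([[a, -b], [np.conj(b), np.conj(a)]])
# The claimed inverse: conjugate the diagonal entries, negate b.
x_inv = np.array([[np.conj(a), b], [-np.conj(b), a]])

assert np.allclose(x @ x_inv, np.eye(2))   # x * x^{-1} = e
assert np.allclose(x_inv @ x, np.eye(2))   # x^{-1} * x = e
# x_inv has the same shape with a -> conj(a), b -> -b, and
# |conj(a)|^2 + |-b|^2 = |a|^2 + |b|^2 = 1, so x_inv lies in U as well.
```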