I'm studying special unitary groups and came across a problem that I'm having trouble with.
Define the special unitary group of degree $2$ as:
$$\text{SU}(2) = \left\{ \begin{bmatrix}\phantom{-}a & b \\ - \bar{b} & \bar{a} \end{bmatrix} : a\bar{a} + b\bar{b} = 1 \right\}$$
If the inner product of two complex vectors $\mathbf{a}$ and $\mathbf{b}$ is defined as:
$$\sum_{i} a_i \bar{b}_i$$
then show that the row vectors form an orthonormal basis.
I started from the definition: a set of vectors forms an orthonormal basis if
- They are orthogonal to each other.
- They have magnitudes of $1$.
- They are linearly independent.
I was able to easily show the first condition by:
$$ \begin{align} \begin{bmatrix} a \\ b \end{bmatrix} \cdot \begin{bmatrix}-\bar{b} \\ \phantom{-} \bar{a}\end{bmatrix} & = a \overline{(-\bar{b})} + b\overline{(\bar{a})} \\ & = -ab + ba \\ & = 0 \end{align} $$
How do I progress with the second and third? I've tried something like:
$$ \left\Vert \begin{bmatrix}a \\ b \end{bmatrix} \right\Vert^2 = a^2 + b^2 $$
but I don't know how to progress from here.
Note that if $[a,b]$ is a row of such a matrix then
$$\|[a,b]\|^2 = [a,b] \cdot [a,b] = a\bar{a}+b\bar{b} = 1, $$
and likewise for the second row,
$$\|[-\bar{b},\bar{a}]\|^2 = (-\bar{b})\overline{(-\bar{b})} + \bar{a}\overline{(\bar{a})} = \bar{b}b + \bar{a}a = 1. $$
As for linear independence, if $$v=\lambda[a,b] + \mu[-\bar{b},\bar{a}]=0$$ then use that $v\cdot[a,b]=0\cdot[a,b]=0$ to deduce $\lambda=0$:
$$0=v\cdot[a,b] = \lambda([a,b]\cdot[a,b]) + \mu([-\bar{b},\bar{a}]\cdot[a,b]) = \lambda$$
Similarly, taking $v\cdot[-\bar{b},\bar{a}]=0$ gives $\mu=0$.
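If it helps to sanity-check the algebra numerically, here is a minimal sketch (the particular values of $a$ and $b$ are an arbitrary choice satisfying $a\bar{a}+b\bar{b}=1$):

```python
import numpy as np

# Arbitrary a, b with |a|^2 + |b|^2 = 1 (0.36 + 0.64 = 1).
a = complex(0.6, 0.0)
b = complex(0.0, 0.8)

# The SU(2) matrix from the definition above.
U = np.array([[a, b],
              [-np.conj(b), np.conj(a)]])

# The inner product <x, y> = sum_i x_i * conj(y_i).
def inner(x, y):
    return np.sum(x * np.conj(y))

row0, row1 = U[0], U[1]
print(abs(inner(row0, row1)))   # orthogonality: ~0
print(inner(row0, row0).real)   # unit norm of first row: ~1
print(inner(row1, row1).real)   # unit norm of second row: ~1
```

The same check works for any admissible pair $(a,b)$, since the cancellations in the derivation above do not depend on the specific values.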