Irreducible representations of $SO(2)$ on $2\times 2$ matrices.


I'm having trouble reconciling my understanding of the representation theory of Lie groups (which is minimal) with my experience playing around with rotations of $2\times 2$ matrices.

Specifically, if we consider the space $M_2$ of $2\times 2$ matrices acted on by the rotation group $SO(2)$, it's fairly easy to work out the irreducible representations. $SO(2)$ is generated by

$$R(\theta)=\left(\begin{array}{cc}\cos\theta&\sin\theta\\-\sin\theta&\cos\theta\end{array}\right)$$

and under the rotation $R(\theta)M_2R^{-1}(\theta)$, there are three irreducible subspaces:

$$\left\{\left(\begin{array}{cc}1&0\\0&1\end{array}\right)\right\},\left\{\left(\begin{array}{cc}0&1\\-1&0\end{array}\right)\right\},\left\{\left(\begin{array}{cc}0&1\\1&0\end{array}\right),\left(\begin{array}{cc}1&0\\0&-1\end{array}\right)\right\}$$

You can get this just by playing around with Mathematica for a while. I would call this $1\oplus 1\oplus 2$.
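The Mathematica experiment can be reproduced with a quick NumPy sketch (the angle $0.7$ is arbitrary, and the variable names `I2`, `J`, `S1` are chosen here for illustration):

```python
import numpy as np

def R(theta):
    """The rotation matrix R(theta) from the question."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

Rt = R(0.7)  # arbitrary angle

# The two one-dimensional invariant subspaces: the identity and the
# antisymmetric matrix are each fixed by conjugation.
I2 = np.eye(2)
J = np.array([[0.0, 1.0], [-1.0, 0.0]])
assert np.allclose(Rt @ I2 @ Rt.T, I2)
assert np.allclose(Rt @ J @ Rt.T, J)

# The symmetric traceless matrices form a two-dimensional invariant
# subspace: conjugating a basis element stays symmetric and traceless.
S1 = np.array([[0.0, 1.0], [1.0, 0.0]])
out = Rt @ S1 @ Rt.T
assert np.allclose(out, out.T)
assert abs(np.trace(out)) < 1e-12
```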

However, if I want to try to figure this out "in the usual way", I would do the following:

  • Determine the irreducible representations on a single vector space $V$.
  • Take the tensor product of those $V^*\otimes V$.

To do the first step, I would normally look for a Cartan subalgebra to construct a diagonal representation, and then look at the action of the other generators to find the irreducible pieces. This group is generated by a single element, but it seems like the essence of the process should still work.

So let's find a diagonal representation by looking for eigenvectors of that rotation matrix. The eigenvalues are $$\lambda_{\pm}=\cos\theta\pm i\sin\theta=e^{\pm i\theta}.$$ So there are two eigenvectors $v_+$ and $v_-$, and no other generators, so the representation decomposes as $V=\langle v_+\rangle\oplus\langle v_-\rangle$.

(That seems right: no nonzero vector in $\mathbb{R}^2$ is left invariant by a generic rotation.)
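A quick NumPy check of the eigenvalues (a sketch; the angle is arbitrary):

```python
import numpy as np

theta = 0.7  # arbitrary angle
Rt = np.array([[np.cos(theta), np.sin(theta)],
               [-np.sin(theta), np.cos(theta)]])

eigvals = np.linalg.eigvals(Rt)
# The eigenvalues are e^{±iθ} = cos θ ± i sin θ; neither is real,
# so no real vector is fixed (up to scale) by a generic rotation.
expected = [np.exp(-1j * theta), np.exp(1j * theta)]
assert np.allclose(sorted(eigvals, key=lambda z: z.imag), expected)
```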

So now the tensor product is generated by $$(v_+\oplus v_-)\otimes(v_+\oplus v_-),$$ with eigenvectors $$R(\theta)(v_+\otimes v_+)=\lambda_+^2(v_+\otimes v_+)$$ $$R(\theta)(v_+\otimes v_-)=\lambda_+\lambda_-(v_+\otimes v_-)$$ $$R(\theta)(v_-\otimes v_+)=\lambda_+\lambda_-(v_-\otimes v_+)$$ $$R(\theta)(v_-\otimes v_-)=\lambda_-^2(v_-\otimes v_-)$$
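Since $R^{-1}=R^T$, conjugation $M\mapsto RMR^{-1}$ acts on $V\otimes V$ as the Kronecker product $R\otimes R$, so its eigenvalues should be $\lambda_+^2$, $\lambda_+\lambda_-=1$ (twice), and $\lambda_-^2$. A NumPy sketch, with an arbitrary angle:

```python
import numpy as np

theta = 0.7  # arbitrary angle
Rt = np.array([[np.cos(theta), np.sin(theta)],
               [-np.sin(theta), np.cos(theta)]])

# Conjugation M -> R M R^T acts on V (x) V as the Kronecker product R (x) R.
K = np.kron(Rt, Rt)
eigvals = np.linalg.eigvals(K)

lam_p, lam_m = np.exp(1j * theta), np.exp(-1j * theta)
expected = sorted([lam_p**2, lam_p * lam_m, lam_m * lam_p, lam_m**2],
                  key=lambda z: z.imag)
assert np.allclose(sorted(eigvals, key=lambda z: z.imag), expected)
```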

So I might now say that the three invariant subspaces above are generated by $v_+\otimes v_+$, $v_-\otimes v_-$, and $v_+\otimes v_-+v_-\otimes v_+$, respectively. But for example, $v_+\otimes v_+$ sure looks more like

$$\left(\begin{array}{cc}a&a\\b&b\end{array}\right)$$

not

$$\left(\begin{array}{cc}1&0\\0&1\end{array}\right)\sim\left(\begin{array}{cc}a&0\\0&a\end{array}\right)$$

Is there something here I am missing?

Best answer:

Your expectation that $v_{+}\otimes v_{+}$ ought to correspond to the identity is incorrect. The tensor product of two vectors $v$ and $w$ is (as a matrix)

$$\begin{bmatrix} v_1w_1 & v_1w_2 \\ v_2w_1 & v_2w_2 \end{bmatrix}$$
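This entrywise formula is easy to check numerically (a sketch with arbitrary numbers):

```python
import numpy as np

v = np.array([2.0, 3.0])
w = np.array([5.0, 7.0])
# The (i, j) entry of the outer product v w^T is v_i * w_j.
M = np.outer(v, w)
assert np.allclose(M, [[10.0, 14.0], [15.0, 21.0]])
```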

Second, if you're talking about $M_2(\mathbb{R})$, there are the three irreps listed above; but if you're talking about $M_2(\mathbb{C})$, then the 2D rep splits as two 1D irreps, for a total of four irreps. (And you need to work over the complex numbers for the eigenvectors to exist, since $\mathbb{R}$ is not algebraically closed.)

Within $\mathbb{C}^2$, the eigenvectors of $R(\theta)$ are

$$ v_{+}=\begin{bmatrix} 1 \\ i \end{bmatrix}, \qquad v_{-}=\begin{bmatrix} 1 \\ -i \end{bmatrix} $$

And the 2D rep within $M_2(\mathbb{C})$ splits into the complex 1D spans of

$$ \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} \pm i\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \quad \longrightarrow \quad \left\{\begin{bmatrix} 1 & i \\ i & -1 \end{bmatrix}\right\}, ~ \left\{\begin{bmatrix} 1 & -i \\ -i & -1 \end{bmatrix} \right\}. $$

Note that the matrices arising from the eigenvectors are

$$ v_+ v_+^T=\begin{bmatrix} 1 & i \\ i & -1 \end{bmatrix}, \quad v_+v_-^T=\begin{bmatrix} 1 & -i \\ i & 1 \end{bmatrix}, \quad v_-v_+^T=\begin{bmatrix} 1 & i \\ -i & 1 \end{bmatrix}, \quad v_-v_-^T=\begin{bmatrix} 1 & -i \\ -i & -1\end{bmatrix}.$$

The matrices $v_+v_-^T$ and $v_-v_+^T$ are fixed under conjugation by $R(\theta)$, and span the same complex 2D subspace as your original two real fixed matrices. The matrices $v_+v_+^T$ and $v_-v_-^T$ are eigenmatrices with eigenvalues $e^{2i\theta}$ and $e^{-2i\theta}$ respectively. For example,

$$R(v_+v_+^T)R^{-1}=Rv_+v_+^TR^T=(Rv_+)(Rv_+)^T=e^{2i\theta}(v_+v_+^T).$$
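These eigenmatrix relations can be verified numerically (a sketch; the angle is arbitrary):

```python
import numpy as np

theta = 0.7  # arbitrary angle
Rt = np.array([[np.cos(theta), np.sin(theta)],
               [-np.sin(theta), np.cos(theta)]])

vp = np.array([1.0, 1j])   # v_+
vm = np.array([1.0, -1j])  # v_-

# R v_± = e^{±iθ} v_±
assert np.allclose(Rt @ vp, np.exp(1j * theta) * vp)
assert np.allclose(Rt @ vm, np.exp(-1j * theta) * vm)

# v_+ v_+^T is an eigenmatrix of conjugation with eigenvalue e^{2iθ} ...
Mpp = np.outer(vp, vp)
assert np.allclose(Rt @ Mpp @ Rt.T, np.exp(2j * theta) * Mpp)

# ... while v_+ v_-^T is fixed.
Mpm = np.outer(vp, vm)
assert np.allclose(Rt @ Mpm @ Rt.T, Mpm)
```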

Another answer:

Firstly, in your solution it seems you're looking at $V\otimes V$ (where $V=\mathbb{C}^2$). If instead we consider $V\otimes V^*$, this is isomorphic to $M_2(\mathbb{C})$ via $$ v\otimes \langle w,\cdot\rangle \longleftrightarrow vw^\top, $$ where $\langle v,w\rangle = v^\top\overline{w}$ denotes the Hermitian inner product on $\mathbb{C}^2$. The corresponding $\operatorname{SO}(2)$-action on $M_2(\mathbb{C})$ is just conjugation.

The eigenvectors in this representation are then $$ v_+v_+^\top,\quad v_+v_-^\top,\quad v_-v_+^\top,\quad v_-v_-^\top, $$ with eigenvalues $e^{2i\theta}, 1, 1, e^{-2i\theta}$ respectively. Note that the irreducible subspaces are all (complex) one-dimensional, as they must be, since $\operatorname{SO}(2)$ is abelian.

Secondly, $v_\pm = e_1\pm ie_2$, where $e_1,e_2$ are the standard basis of $\mathbb{C}^2$. So for example $$ v_+v_+^\top = \begin{pmatrix}1 \\ i\end{pmatrix}\begin{pmatrix}1 & i\end{pmatrix} = \begin{pmatrix}1 & i \\ i & -1\end{pmatrix}. $$

The real irreducible subspaces have bases that are (complex) linear combinations of these eigenmatrices. For example, the real 2-dimensional irrep is spanned by $$v_+v_+^\top+v_-v_-^\top\quad \textrm{and}\quad i(v_+v_+^\top - v_-v_-^\top).$$
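As a check, these combinations recover (up to scale) the real symmetric traceless basis from the question (a NumPy sketch):

```python
import numpy as np

vp = np.array([1.0, 1j])   # v_+ = e_1 + i e_2
vm = np.array([1.0, -1j])  # v_- = e_1 - i e_2

Mpp = np.outer(vp, vp)  # [[1, i], [i, -1]]
Mmm = np.outer(vm, vm)  # [[1, -i], [-i, -1]]

# Real matrices spanning the 2-dimensional real irrep:
assert np.allclose(Mpp + Mmm, 2 * np.array([[1.0, 0.0], [0.0, -1.0]]))
assert np.allclose(1j * (Mpp - Mmm), -2 * np.array([[0.0, 1.0], [1.0, 0.0]]))
```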