What linear transformations preserve these conditions?


Main Question

Let's define $\Gamma(n)$ as the set of real antisymmetric matrices of size $n$ ($n$ an even integer) fulfilling $$ \gamma^2=-\mathbb I_n \quad \forall \gamma\in \Gamma(n),$$ where $\mathbb I_n$ is the identity matrix of size $n$. What is a nice representation of the set of all linear operators $$\gamma_{i,j}\to \gamma'_{i,j} = \sum_{i',j'=1}^n \gamma_{i',j'}\beta_{i,i'}\beta_{j,j'},$$ where $\beta \in \mathbb C^{n\times n}$, that keep matrices inside $\Gamma(n)$?
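Concretely, a canonical element of $\Gamma(n)$ is the direct sum of $n/2$ copies of the $2\times 2$ rotation block. A quick NumPy sketch (my own illustration, not part of the question) verifies the defining property:

```python
import numpy as np

def canonical_gamma(n):
    """Direct sum of n/2 copies of [[0, 1], [-1, 0]]: a canonical element of Gamma(n)."""
    assert n % 2 == 0
    block = np.array([[0.0, 1.0], [-1.0, 0.0]])
    return np.kron(np.eye(n // 2), block)

gamma = canonical_gamma(6)
assert np.allclose(gamma, -gamma.T)            # antisymmetric
assert np.allclose(gamma @ gamma, -np.eye(6))  # gamma^2 = -I
```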

For instance, I know that if $\gamma\to \gamma'=U\gamma U^\dagger$, where $U$ is a unitary matrix, then clearly $$\gamma'^2=U\gamma U^\dagger U\gamma U^\dagger=U\gamma^2 U^\dagger=-U\mathbb I_n U^\dagger=-\mathbb I_n,$$ but I don't know which unitary matrices preserve the antisymmetry of a matrix.
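As a numerical illustration (my own NumPy sketch, with an arbitrarily chosen diagonal unitary): a real orthogonal conjugation keeps $\gamma$ inside $\Gamma(n)$, whereas a generic unitary conjugation $U\gamma U^\dagger$ still squares to $-\mathbb I_n$ but can break antisymmetry:

```python
import numpy as np

n = 4
gamma = np.kron(np.eye(n // 2), np.array([[0.0, 1.0], [-1.0, 0.0]]))

# A real orthogonal conjugation keeps gamma inside Gamma(n)...
O, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((n, n)))
gp = O @ gamma @ O.T
assert np.allclose(gp, -gp.T) and np.allclose(gp @ gp, -np.eye(n))

# ...but a generic (here diagonal) unitary conjugation U gamma U^dagger,
# while still squaring to -I, need not stay antisymmetric:
U = np.diag(np.exp(1j * np.array([0.0, 0.7, 0.0, 0.3])))
gq = U @ gamma @ U.conj().T
assert np.allclose(gq @ gq, -np.eye(n))  # still squares to -I
assert not np.allclose(gq, -gq.T)        # antisymmetry is lost
```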


Background

The background is heavily related to physics, so I thought I should separate it from the main question. The $\gamma$ matrices that I'm dealing with are called Covariance Matrices (CMs); in the case of this problem they are defined as $$\gamma_{l,m}=\frac{i}{2}\text{tr}(\rho[c_l,c_m]),$$ where $\rho$ is the density matrix of a pure Gaussian state, the $c$'s are Majorana mode operators fulfilling $\{c_l,c_m\}=2\delta_{l,m}$, and $[\cdot,\cdot]$ ($\{\cdot,\cdot\}$) denotes the commutator (anticommutator).

Also, the condition $\gamma^2=-\mathbb I_n$ is satisfied iff the state $\rho$ is pure[1].


Best Answer

After a series of exchanges in the comment section, the OP has clarified that the true question is as follows:

Let $n$ be an even number and $\Gamma(n)\subset M_n(\mathbb R)$ be the set of all $n\times n$ real antisymmetric square roots of $-I_n$. Find all (complex) unitary matrices $\beta$ such that the mapping $\gamma\mapsto\beta\gamma\beta^\top$ (note: not $\gamma\mapsto\beta\gamma\beta^\dagger$, even though $\beta$ may be complex) preserves $\Gamma(n)$.

We will prove the following.

  • When $n\ge4$, the required matrices are given by those $\beta$s such that $\beta$ or $i\beta$ is a real orthogonal matrix.
  • When $n=2$, the required matrices are given by those $\beta$s such that $\beta$ or $i\beta$ is of the form $$ \pmatrix{z\,\cos t&-\bar{w}\,\sin t\\ w\,\sin t&\bar{z}\,\cos t}, $$ where $t$ is an arbitrary real number and $z,w$ are arbitrary complex numbers of unit moduli.

Proof for $n\ge4$. Let $\mathcal K$ denote the real linear space of all real antisymmetric matrices. Define $h=\beta^\top\beta$. Then $h$ is unitary, and the condition $(\beta\gamma\beta^\top )^2 = -I$ gives $\gamma h\gamma = -h^{-1}$; multiplying on the left by $\gamma^{-1}=-\gamma$ yields $h\gamma = \gamma h^{-1} = \gamma \bar{h}^\top$, where the last equality uses the unitarity of $h$. Since $\mathcal K$ is the real linear span of $\Gamma(n)$, it follows that $$ hk = k\bar{h}^\top\tag{1} $$ for every $k$ in $\mathcal K$. Denote by $E_{ij}$ the matrix with a $1$ at the $(i,j)$-th position and zeros elsewhere. By considering all $k$s of the form $E_{ij}-E_{ji}$ and comparing both sides of $(1)$ elementwise, it is easy to see that $h$ is a real multiple of the identity matrix (that $n>2$ is essential here). Yet $h$ is unitary. Therefore $h=\beta^\top \beta=\pm I$.

If $\beta^\top \beta=I$, since $\beta$ is also unitary, we have $\beta^\top \beta=I=\beta^\dagger \beta$. Hence $\beta^\top=\beta^\dagger$, i.e. $\beta$ is real. Therefore $\beta$ is real orthogonal.

If $\beta^\top \beta=-I$, then $(i\beta)^\top(i\beta)=I$ and $i\beta$ is also unitary. So, the same argument shows that $i\beta$ is real orthogonal.
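The characterization can be sanity-checked numerically (a NumPy sketch of mine, not part of the proof): when $\beta$ or $i\beta$ is real orthogonal the image stays in $\Gamma(n)$, while a unitary $\beta$ with $\beta^\top\beta \neq \pm I$ fails, since the image is not even real:

```python
import numpy as np

n = 4
gamma = np.kron(np.eye(n // 2), np.array([[0.0, 1.0], [-1.0, 0.0]]))
q, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((n, n)))

# beta or i*beta real orthogonal => gamma -> beta gamma beta^T stays in Gamma(n)
for beta in (q, 1j * q):
    gp = beta @ gamma @ beta.T
    assert np.allclose(gp.imag, 0)  # image is real
    gp = gp.real
    assert np.allclose(gp, -gp.T) and np.allclose(gp @ gp, -np.eye(n))

# a diagonal unitary beta has beta^T beta != +-I; its image is not even real
beta = np.diag(np.exp(1j * np.array([0.2, 0.5, 0.1, 0.9])))
gp = beta @ gamma @ beta.T
assert not np.allclose(gp.imag, 0)
```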

Proof for $n=2$. The set $\Gamma(2)$ has only two elements, namely, $g=\pmatrix{0&-1\\ 1&0}$ and $-g$. If the mapping $f:\gamma\mapsto\beta\gamma\beta^\top$ preserves $\Gamma(2)$, then either $\beta g\beta^\top=g$ (and $f$ is the identity map) or $\beta g\beta^\top=-g$ (and $-f$ is the identity map). In the latter case, we have $(i\beta) g(i\beta)^\top=g$, so it suffices to consider the former case only. Since $\beta$ is unitary, the condition $\beta g\beta^\top=g$ is equivalent to $\beta g=g\bar{\beta}$. By comparing elementwise both sides of this equality and also both sides of $\beta\beta^\dagger = I$, the result follows.

Another Answer

Observe that $\Gamma(n)$ is the set of all real $n \times n$ skew-symmetric orthogonal matrices, i.e., $$ \Gamma(n) = O(n) \cap \mathfrak{o}(n) \cap M_n(\mathbb{R}), $$ and let $$ \mathcal{U}(n) := \{U \in U(n) \mid U\Gamma(n)U^T \subset \Gamma(n)\}. $$ I claim that $$ \mathcal{U}(n) = \begin{cases} \{ \eta U_0 \mid \eta \in \{\pm 1, \pm i\}, \; U_0 \in SU(2)\} &\text{if $n=2$,}\\ O(n) \cup iO(n) &\text{if $n \geq 4$.} \end{cases} $$

Let us first consider the case where $n \geq 4$.

First, observe that if $O \in O(n)$, then for any $\gamma \in \Gamma(n)$, $O \gamma O^T$ and $(iO)\gamma(iO)^T = -O \gamma O^T$ are still real, skew-symmetric, and orthogonal, so that $\mathcal{U}(n) \supset O(n) \cup iO(n)$. Hence, it suffices to show the reverse inclusion, $\mathcal{U}(n) \subset O(n)\cup iO(n)$. To do so, however, we will need the following lemmas.

Lemma 0: Let $v$, $w \in \mathbb{R}^n$ be orthogonal unit vectors. Then there exists some $\gamma_{wv} \in \Gamma(n)$ such that $w = \gamma_{wv} v$ (and hence $v = -\gamma_{wv} w$).

Proof: Let $e_1 = v$ and $e_2 = w$, and complete $\{e_1,e_2\}$ to an orthonormal basis $\{e_1,e_2,\dotsc,e_n\}$ of $\mathbb{R}^n$. Then $\gamma_{wv} \in \Gamma(n)$ defined by $$ \gamma_{wv} e_{2k+1} = e_{2k+2}, \quad \gamma_{wv} e_{2k+2} = -e_{2k+1}, \quad 0 \leq k \leq \frac{n-2}{2} $$ is a skew-symmetric orthogonal real matrix such that $w = \gamma_{wv} v$, as required. QED
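Lemma 0's construction can be carried out explicitly (a NumPy sketch of mine; the helper name `gamma_wv` is my own):

```python
import numpy as np

def gamma_wv(v, w):
    """Given orthonormal v, w in R^n (n even), build gamma in Gamma(n) with w = gamma v."""
    n = len(v)
    # Complete {v, w} to an orthonormal basis via QR on [v | w | random columns].
    rng = np.random.default_rng(2)
    m = np.column_stack([v, w, rng.standard_normal((n, n - 2))])
    q, r = np.linalg.qr(m)
    q = q * np.sign(np.diag(r))  # fix column signs so q[:,0] = v, q[:,1] = w
    # In that basis, pair up vectors: e_{2k+1} -> e_{2k+2}, e_{2k+2} -> -e_{2k+1}.
    block = np.kron(np.eye(n // 2), np.array([[0.0, -1.0], [1.0, 0.0]]))
    return q @ block @ q.T

v = np.array([1.0, 0.0, 0.0, 0.0])
w = np.array([0.0, 1.0, 0.0, 0.0])
g = gamma_wv(v, w)
assert np.allclose(g @ v, w)              # w = gamma v
assert np.allclose(g, -g.T)               # skew-symmetric
assert np.allclose(g @ g, -np.eye(4))     # orthogonal square root of -I
```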

Lemma 1: $\{A \in M_n(\mathbb{R}) \mid A = A^T, \; \forall \gamma \in \Gamma(n), \; A \gamma = \gamma A\} = \mathbb{R}I_n$.

Proof: Let $A$ be such a matrix. Let $\{e_1,\dotsc,e_n\}$ be the standard ordered basis of $\mathbb{R}^n$.

Let us first show that the diagonal entries of $A$ are all equal. First, observe that for any $v \in \mathbb{R}^n$, $$ \langle \gamma v, A \gamma v \rangle = \langle \gamma v, \gamma A v \rangle = \langle \gamma^T \gamma v, A v \rangle = \langle v, A v \rangle. $$ But now, by Lemma 0, for any $k \neq l$ there exists $\gamma \in \Gamma(n)$ such that $e_l = \gamma e_k$, so that $$ A_{ll} = \langle e_l, A e_l \rangle = \langle \gamma e_k, A \gamma e_k \rangle = \langle e_k, A e_k \rangle = A_{kk}. $$ Hence, indeed, all the diagonal entries of $A$ are equal.

Now, let us show that $A$ is diagonal. First, observe that $$ (A\gamma)^T = \gamma^T A^T = -\gamma A = -(A \gamma), $$ so that $A\gamma$ is skew-symmetric, and hence $\langle v, A \gamma v \rangle = 0$ for all $v \in \mathbb{R}^n$. But now, by Lemma 0, for any $k \neq l$ there exists $\gamma \in \Gamma(n)$ such that $e_l = \gamma e_k$, so that $$ A_{kl} = \langle e_k, A e_l \rangle = \langle e_k, A \gamma e_k \rangle = 0. $$ Hence, $A$ is indeed diagonal.

Finally, since $A$ is diagonal with all diagonal entries equal, it follows that $A = \alpha I_n$ for some $\alpha \in \mathbb{R}$. QED

Lemma 2: Suppose that $n \geq 4$. Then $\{B \in M_n(\mathbb{R}) \mid B = B^T, \; \forall \gamma \in \Gamma(n), \; B \gamma = -\gamma B\} = 0$.

Proof: Let $B$ be such a matrix. First, observe that for any $x \in \mathbb{R}^n$, $$ \langle \gamma x, B \gamma x \rangle = -\langle \gamma x, \gamma B x \rangle = -\langle \gamma^T \gamma x , B x \rangle = - \langle x, B x \rangle. $$ Let $u \in \mathbb{R}^n$ be a unit vector. Since $n \geq 4 \geq 3$, we may find $v$ and $w \in \mathbb{R}^n$ such that $\{u,v,w\}$ is orthonormal; hence, by Lemma 0, find $\gamma_{vu}$, $\gamma_{wv}$, and $\gamma_{uw}$ such that $$ v = \gamma_{vu}u, \quad w = \gamma_{wv} v, \quad u = \gamma_{uw} w. $$ But now, $$ \langle u, B u \rangle = \langle \gamma_{uw} w, B \gamma_{uw} w \rangle = - \langle w, B w \rangle = - \langle \gamma_{wv} v, B \gamma_{wv} v \rangle = \langle v, B v\rangle = \langle \gamma_{vu} u, B \gamma_{vu} u \rangle = - \langle u, B u \rangle, $$ so that $\langle u, B u \rangle = 0$. Since $u$ was an arbitrary unit vector, it follows that $\langle x, B x \rangle = 0$ for all $x \in \mathbb{R}^n$, and hence that $B$ is skew-symmetric. Since $B$ was assumed to be symmetric, this therefore implies that $B = 0$. QED
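To see why $n \geq 4$ is essential in Lemma 2 (an illustration of mine, not part of the proof): for $n = 2$, the nonzero symmetric matrix $B = \operatorname{diag}(1,-1)$ anticommutes with every element of $\Gamma(2) = \{\pm\gamma\}$:

```python
import numpy as np

# n = 2 counterexample to the conclusion of Lemma 2:
# B is symmetric, nonzero, and anticommutes with every element of Gamma(2).
gamma = np.array([[0.0, 1.0], [-1.0, 0.0]])
B = np.diag([1.0, -1.0])
for g in (gamma, -gamma):
    assert np.allclose(B @ g, -g @ B)  # B anticommutes with g
```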

At last, suppose that $U \in \mathcal{U}(n)$. Then, for any $\gamma \in \Gamma(n)$, $U \gamma U^T$ is real, so that $$ U \gamma U^T = \overline{U \gamma U^T} = \overline{U} \gamma U^\ast, $$ and hence $$ U^T U \gamma = \gamma \overline{U^T U}. $$ Now, write $U^T U = A + iB$, where $A$, $B \in M_n(\mathbb{R})$; since $U^T U$ is symmetric, so too are $A$ and $B$. But then, $$ 0 = U^T U \gamma - \gamma \overline{U^T U} = (A+iB)\gamma - \gamma(A-iB) = (A\gamma - \gamma A) + i (B\gamma + \gamma B), $$ so that since $A$, $B$ and $\gamma$ are all real, $$ A\gamma = \gamma A, \quad B \gamma = - \gamma B. $$ By Lemma 1, then, $A = \alpha I_n$ for some $\alpha \in \mathbb{R}$, whilst by Lemma 2, $B = 0$. Hence, $U^T U = A = \alpha I_n$; since $U^T U$ is unitary, it therefore follows that $\alpha = \pm 1$. But then, since $U^T U = \pm I_n = \pm U^\ast U$, it follows that $U^T = \pm U^\ast$, and hence that $\overline{U} = \pm U$. Thus, $U \in O(n)$ or $U \in iO(n)$, as was claimed.

Now, let us consider the remaining case of $n = 2$, which we may carry out explicitly. On the one hand, you can check directly that $$ \Gamma(2) = \{ \pm \gamma \}, \quad \gamma = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}, $$ so that $$ \mathcal{U}(2) = \{ U \in U(2) \mid U \gamma U^T = \pm \gamma \}. $$ So, let $U \in U(2)$, and write $U = \eta U_0$ for $\eta = \det(U) \in U(1)$ and $U_0 = \eta^{-1}U \in SU(2)$. Hence, we can write $$ U = \eta \begin{pmatrix} \alpha & -\overline{\beta} \\ \beta & \overline{\alpha} \end{pmatrix} $$ for $\alpha$, $\beta \in \mathbb{C}$ with $|\alpha|^2 + |\beta|^2 = 1$. But then, $$ U \gamma U^T = \eta \begin{pmatrix} \alpha & -\overline{\beta} \\ \beta & \overline{\alpha} \end{pmatrix} \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} \eta \begin{pmatrix} \alpha & \beta \\ -\overline{\beta} & \overline{\alpha} \end{pmatrix} = \eta^2 \begin{pmatrix} 0 & 1\\ -1 & 0 \end{pmatrix} = \eta^2 \gamma. $$ Hence, $U \in \mathcal{U}(2)$ if and only if $\eta^2 = \pm 1$, if and only if $\eta \in \{\pm 1, \pm i\}$, so that $$ \mathcal{U}(2) = \{ \eta U_0 \mid \eta \in \{\pm 1, \pm i\}, \; U_0 \in SU(2)\}. $$
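This case can also be checked numerically (a sketch of mine; $\alpha = 0.6$, $\beta = 0.8i$ are arbitrary choices with $|\alpha|^2 + |\beta|^2 = 1$):

```python
import numpy as np

gamma = np.array([[0.0, 1.0], [-1.0, 0.0]])

def su2(alpha, beta):
    """Generic SU(2) matrix parametrized by alpha, beta with |alpha|^2 + |beta|^2 = 1."""
    return np.array([[alpha, -np.conj(beta)], [beta, np.conj(alpha)]])

U0 = su2(0.6, 0.8j)
for eta in (1, -1, 1j, -1j):       # eta^2 = +-1: U = eta*U0 preserves Gamma(2)
    U = eta * U0
    assert np.allclose(U @ gamma @ U.T, eta**2 * gamma)

eta = np.exp(0.3j)                 # generic phase: eta^2 != +-1, so not preserved
out = (eta * U0) @ gamma @ (eta * U0).T
assert not (np.allclose(out, gamma) or np.allclose(out, -gamma))
```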