How do I find $\operatorname{det} T_Q$?


Let $S$ be the space of all $n \times n$ real skew symmetric matrices and let $Q$ be a real orthogonal matrix. Consider the map $T_Q: S \to S$ defined by $$T_Q(X) = QXQ^T.$$ Find $\operatorname{det} T_Q$.

I thought about diagonalizing $Q$, but I don't think we know it is real diagonalizable. I can show it is an isometry using the Hilbert-Schmidt inner product, but I can't really relate it to the determinant of $Q$ (I've seen posts here that say the determinant should be $(\det Q)^{n-1}$). So all I know is that $\operatorname{det} T_Q = \pm1$. How would I find $\det T_Q$?


Partial Answer: I will focus on the case that $n$ is even. The odd case can be handled similarly.

As a consequence of the block-diagonalizability of skew-symmetric matrices and the fact that $Q = \exp(P)$ holds for some skew-symmetric matrix $P$, we can show that there exists an orthogonal matrix $W$ such that $$ WQW^T = D:= \pmatrix{A_1\\ & \ddots \\ && A_k}, \quad A_j = \pmatrix{a_j & -b_j\\ b_j & a_j}, \quad a_j^2 + b_j^2 = 1. $$ Note that $T_Q = T_{W^T} \circ T_D \circ T_W$, so that $\det(T_Q) = \det(T_D)$. Decompose $D$ into a product $D = D_1 \cdots D_k$ where $$ D_1 = \pmatrix{A_1 \\ & I \\ & & \ddots \\ &&& I}, \dots, \quad D_k = \pmatrix{I \\ & \ddots \\ & & I \\ &&& A_k}. $$ Note that $T_D = T_{D_1} \circ \cdots \circ T_{D_k}$. With that, it suffices to determine $\det T_{D_j}$ for each $j$.


Another approach is to use the fact that, when $\det Q = 1$, there exists a skew-symmetric matrix $M$ for which $Q = \exp(M)$, and that $T_Q = \exp(C_M)$, where $$ C_M(X) = MX - XM. $$ From there, we can use the fact that $$ \det(T_Q) = \det(\exp(C_M)) = \exp(\operatorname{tr}(C_M)). $$ Once we show that $\operatorname{tr}(C_M) = (n-1)\operatorname{tr}(M)$, it follows that $$ \begin{align} \det(T_Q) &= \exp(\operatorname{tr}(C_M)) = \exp((n-1)\operatorname{tr}(M)) = \exp(\operatorname{tr}(M))^{n-1} \\ & = \det(\exp(M))^{n-1} = \det(Q)^{n-1}. \end{align} $$ (Since $M$ is skew-symmetric, $\operatorname{tr}(M) = 0$, so both sides equal $1$ here; the case $\det Q = -1$ requires a separate argument.)
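A numerical sketch of the identities $T_Q = \exp(C_M)$ and $\operatorname{tr}(C_M) = (n-1)\operatorname{tr}(M)$ (using `scipy.linalg.expm`; the helper names and basis choice are my own):

```python
import numpy as np
from scipy.linalg import expm

def skew_basis(n):
    """Basis of the space S of n x n real skew-symmetric matrices."""
    basis = []
    for i in range(n):
        for j in range(i + 1, n):
            E = np.zeros((n, n))
            E[i, j], E[j, i] = 1.0, -1.0
            basis.append(E)
    return basis

def mat_of(op, n):
    """Matrix of a linear operator on S, in skew_basis coordinates."""
    iu = np.triu_indices(n, k=1)
    return np.column_stack([op(E)[iu] for E in skew_basis(n)])

n = 4
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))
M = A - A.T                    # skew-symmetric, so Q = exp(M) lies in SO(n)
Q = expm(M)

TQ = mat_of(lambda X: Q @ X @ Q.T, n)      # matrix of T_Q on S
CM = mat_of(lambda X: M @ X - X @ M, n)    # matrix of C_M on S

assert np.allclose(TQ, expm(CM))                        # T_Q = exp(C_M)
assert np.isclose(np.trace(CM), (n - 1) * np.trace(M))  # tr(C_M) = (n-1) tr(M) = 0
assert np.isclose(np.linalg.det(TQ), 1.0)               # det(Q) = 1 case
```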


Revised proof: I have revised this into two distinct proofs of the result, one analytic and one algebraic. In both cases, the key insight comes from examining a very simple reflection matrix
$D:= \displaystyle \left[\begin{matrix}-1 & \mathbf 0 \\ \mathbf 0 & I_{n-1}\end{matrix}\right]$

and computing
$T_D\mathbf B = \mathbf B \displaystyle \left[\begin{matrix}-I_{n-1} & \mathbf 0 \\ \mathbf 0 & I_{\binom{n}{2}-(n-1)}\end{matrix}\right]=\mathbf BA$
where $\mathbf B$ is a collection of well-chosen (skew-symmetric matrix) basis vectors. This computation is shown at the very end under "Computing $T_D\mathbf B$".

The conclusion ultimately is that $\det\big(T_Q\big) =\det\big(Q\big)^{n-1}$.

1.) analytic proof:
The determinant is a continuous, integer-valued function (taking values in $\big\{-1,+1\big\}$ here, as identified by the OP), hence constant on any connected component: if two maps in this group are joined by a path, they have the same determinant.
Case 1: $\det\big(Q\big)=1$.
Then $Q$ is path-connected to the identity, so $T_Q$ is path-connected to $T_I$ (where $T_IX = IXI=X$). Thus, $\det\big(T_Q\big) =\det\big(T_I\big) =1$.

Case 2: $\det\big(Q\big)=-1$.
Then $Q$ is path-connected to $D$, so $T_Q$ is path-connected to $T_D$, and
$\det\big(T_Q\big)=\det\big(T_D\big)=(-1)^{n-1}$.

2.) algebraic proof:
Observe that for $Q_1, Q_2 \in O_n\big(\mathbb R\big)$,
$T_{(Q_1Q_2)}X = Q_1Q_2XQ_2^T Q_1^T=T_{Q_1}T_{Q_2}X$,
so $Q \mapsto T_Q$ is a group homomorphism; taking $Q_1=Q_2^T$ gives $T_{Q_2^T}=T_{Q_2}^{-1}$.
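These two identities are easy to spot-check numerically (a sketch with randomly generated orthogonal matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
Q1, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal matrices
Q2, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = rng.standard_normal((n, n))
X = A - A.T                                        # random skew-symmetric X

# T_{Q1 Q2} X == T_{Q1}(T_{Q2} X)
assert np.allclose((Q1 @ Q2) @ X @ (Q1 @ Q2).T,
                   Q1 @ (Q2 @ X @ Q2.T) @ Q1.T)
# T_{Q^T} inverts T_Q
assert np.allclose(Q2.T @ (Q2 @ X @ Q2.T) @ Q2, X)
```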

(Ignoring the trivial $n=1$ case.) For arbitrary $Q \in O_n\big(\mathbb R\big)$, first decompose $Q$ into a product of $r$ Householder matrices $H_j$, for some $1\leq r\leq n$; since each Householder matrix has determinant $-1$, we know $r$ is even if $\det\big(Q\big)=1$ and odd if $\det\big(Q\big)=-1$. In what follows, each $U_k$ is an orthogonal matrix (every Householder matrix is orthogonally similar to $D$) and each $W_k$ is an orthogonal matrix of the appropriate dimension.

$T_Q\mathbf B$
$= T_{(H_1H_2\cdots H_r)}\mathbf B$
$= T_{H_1}T_{H_2}\cdots T_{H_r}\mathbf B$
$= T_{U_1DU_1^T}T_{U_2DU_2^T}\cdots T_{U_rDU_r^T}\mathbf B$
$= \big(T_{U_1}T_DT_{U_1}^{-1}\big)\big(T_{U_2}T_{D}T_{U_2}^{-1}\big)\cdots \big(T_{U_r}T_DT_{U_r}^{-1}\big)\mathbf B$
$= \mathbf B \big(W_1 A W_1^{-1}\big)\big(W_2 A W_2^{-1}\big) \cdots \big(W_r A W_r^{-1}\big)$
$\implies \det\big(T_Q\big) = \det\big(W_1 A W_1^{-1}\big)\det\big(W_2 A W_2^{-1}\big) \cdots \det\big(W_r A W_r^{-1}\big)=\det\big(A\big)^r = \Big((-1)^{n-1}\Big)^r$
which is to say $\det\big(T_Q\big)=1$ if $n$ is odd or $\det\big(Q\big) = 1$, and
$\det\big(T_Q\big)=-1$ in the case of even $n$ and $\det\big(Q\big) =-1$; in all cases this agrees with $\det\big(Q\big)^{n-1}$.

The technique here is to examine some group homomorphism (determinant) by decomposing the group into its generators (Householder matrices) and then examine how the generators look under the homomorphism.


Computing $T_D\mathbf B$:

$D:= \displaystyle \left[\begin{matrix}-1 & \mathbf 0 \\ \mathbf 0 & I_{n-1}\end{matrix}\right]$
Now construct a simple basis for the space of real skew-symmetric matrices (where $\mathbf e_k$ is the $k$th standard basis vector in $\mathbb R^n$).

$v_1 := \mathbf e_1\mathbf e_2^T-\big(\mathbf e_1\mathbf e_2^T\big)^T$
$v_2 := \mathbf e_1\mathbf e_3^T-\big(\mathbf e_1\mathbf e_3^T\big)^T$
$v_3 := \mathbf e_1\mathbf e_4^T-\big(\mathbf e_1\mathbf e_4^T\big)^T$
$\vdots$
$v_{\binom{n}{2}} := \mathbf e_{n-1}\mathbf e_{n}^T-\big(\mathbf e_{n-1}\mathbf e_{n}^T\big)^T$
(the pattern is hopefully clear -- each matrix is all zeros except for a single $1$ and a single $-1$)

collect these in
$\mathbf B:= \bigg[\begin{array}{c|c|c|c|c|c} v_1 & \cdots &v_{n-1}& v_n &\cdots & v_{\binom{n}{2}}\end{array}\bigg]$
(Artin would refer to this as a 'hyper-vector' though not many other texts use this term)

Now, by inspection, $T_D$ leaves a vector (skew matrix) unchanged if it has all zeros in its first row and column, whereas $T_D$ negates a vector (skew matrix) whose nonzero entries all lie in its first row and column (recalling that, since the vector space consists of real skew matrices, the diagonal entries are zero). Put differently:

for $k\in\big\{1,2,...,n-1\big\}$,
$T_Dv_k = -v_k$
(i.e. all skew-symmetric basis vectors with a nonzero entry in the first row),

and for $k\in\big\{n,n+1,...,\binom{n}{2}\big\}$,
$T_Dv_k = v_k$.

to conclude
$T_D\mathbf B = \mathbf B \displaystyle \left[\begin{matrix}-I_{n-1} & \mathbf 0 \\ \mathbf 0 & I_{\binom{n}{2}-(n-1)}\end{matrix}\right]= \mathbf BA$

$\implies\det\big(T_D\big)= \det\big(A\big)= (-1)^{n-1}\cdot 1= (-1)^{n-1}$
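The computation above can also be spot-checked numerically (a sketch; the basis construction mirrors the $v_k$ defined above, in the same order):

```python
import numpy as np

def skew_basis(n):
    """v_1, ..., v_{n(n-1)/2}: e_i e_j^T - e_j e_i^T for i < j, first-row pairs first."""
    basis = []
    for i in range(n):
        for j in range(i + 1, n):
            E = np.zeros((n, n))
            E[i, j], E[j, i] = 1.0, -1.0
            basis.append(E)
    return basis

for n in range(2, 6):
    D = np.diag([-1.0] + [1.0] * (n - 1))
    det_TD = 1.0
    for k, v in enumerate(skew_basis(n)):
        sign = -1.0 if k < n - 1 else 1.0   # first n-1 vectors touch row/column 1
        assert np.allclose(D @ v @ D.T, sign * v)
        det_TD *= sign
    assert det_TD == (-1.0) ** (n - 1)      # det(T_D) = (-1)^(n-1)
```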