If an orthogonal matrix has determinant -1 then it has -1 as an eigenvalue


I want to show that if a real orthogonal matrix $A$ has determinant $-1$ then $\lambda=-1$ must be an eigenvalue of $A$.

I have proven this in a long-winded way, and I was wondering if there is a quicker way of seeing it.


There are 3 solutions below.


If $A$ has determinant $-1$, it reverses orientation, since a rotation preserves orientation. (If $A$ were moreover an involution, one would get a decomposition $\mathbb{R}^n=U\oplus V$ with $U=\{x:A(x)=x\}$ and $V=\{x:A(x)=-x\}$, but this is not needed for what follows.)

You also have $AA^T=I$, so $A^{-1}=A^T$, and hence $\det(A+I)=\det(A(I+A^{-1}))=\det(A)\det(I+A^T)=-\det(I+A^T)$.

Since $(I+A)^T=I+A^T$, we have $\det(I+A)=\det(I+A^T)$. Combining the two equalities, $\det(I+A)=-\det(I+A)$, so $2\det(I+A)=0$ and therefore $\det(I+A)=0$.
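The identity $\det(I+A)=0$ can be checked numerically. The construction below (a random orthogonal matrix with one column flipped to force determinant $-1$) is a hypothetical example of mine, not part of the answer:

```python
import numpy as np

# Sanity check of the argument above: for an orthogonal A with det A = -1,
# det(A + I) must vanish, i.e. -1 is an eigenvalue of A.
rng = np.random.default_rng(0)
n = 5
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal matrix
if np.linalg.det(Q) > 0:
    Q[:, 0] *= -1                                 # force det Q = -1

assert np.isclose(np.linalg.det(Q), -1.0)
print(np.linalg.det(Q + np.eye(n)))               # numerically ~ 0
```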


The matrix is diagonalizable over $\mathbb C$, so the determinant is the product of the eigenvalues.

The complex eigenvalues that are not real come in conjugate pairs, and the product of two conjugate eigenvalues is $\lambda\bar{\lambda}=|\lambda|^2$, a positive real. Since the determinant is $-1<0$, there has to be at least one negative real eigenvalue.

The only negative real that can be an eigenvalue of an orthogonal matrix (which preserves the Euclidean norm of a vector) is $-1$.


Actually it's not necessary to appeal to diagonalizability; just considering the characteristic polynomial will do.
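This eigenvalue argument can be illustrated concretely. The matrix below is a hypothetical example of mine (a $2\times2$ rotation block combined with a one-dimensional reflection), not taken from the answer:

```python
import numpy as np

# Illustration of the argument above: the eigenvalues of an orthogonal
# matrix lie on the unit circle, the non-real ones pair into conjugates,
# and when det A = -1 the eigenvalue -1 must occur.
theta = 0.7
A = np.zeros((3, 3))
A[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]   # rotation block
A[2, 2] = -1.0                                  # reflection factor

eig = np.linalg.eigvals(A)
assert np.allclose(np.abs(eig), 1.0)            # all on the unit circle
assert np.isclose(np.linalg.det(A), -1.0)       # det A = -1
assert np.any(np.isclose(eig, -1.0))            # -1 is an eigenvalue
```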


Henning's answer uses the assumption that $A$ has real entries, but the claim of the question is true even if we drop it. More precisely, we have the following:

Theorem 1. Let $R$ be any commutative ring with unity. Let $n$ be a nonnegative integer. Let $A\in R^{n\times n}$ be an orthogonal matrix over $R$ (that is, $A^T A=AA^T =I_n $) such that $\det A\neq1$. Then, $\det\left( A+I_n \right) $ is not an invertible element of $R$.

If $R$ is a field, then the conclusion of Theorem 1 can be restated as "$\det\left( A+I_n \right) =0$" (since the only element of $R$ that is not invertible is $0$), and this of course means that $-1$ is an eigenvalue of $A$ (since evaluating the characteristic polynomial of $A$ at $-1$ yields $\det\left( A-\left( -1\right) I_n \right) =\det\left( A+I_n \right) $). If $R$ is furthermore a field of characteristic $0$, then $\det A=-1$ implies $\det A\neq1$. Thus, Theorem 1 is much more general than the claim in question.

Theorem 1 can be proven purely algebraically, without ever leaving the ring $R$:

Proof of Theorem 1. We have $AA^T =I_n $ (since $A$ is orthogonal), thus $I_n =AA^T $. But \begin{equation} \left( A+I_n \right) ^T =A^T +\left( I_n \right) ^T =\underbrace{\left( I_n \right) ^T }_{=I_n =AA^T }+\underbrace{A^T }_{=I_n A^T }=AA^T +I_n A^T =\left( A+I_n \right) A^T . \end{equation}

But every $n\times n$-matrix $B$ satisfies $\det\left( B^T \right) =\det B$. Applying this equality to $B=A$, we find $\det\left( A^T \right) =\det A$. Applying the same equality to $B=A+I_n $, we obtain \begin{equation} \det\left( \left( A+I_n \right) ^T \right) =\det\left( A+I_n \right) . \end{equation} Hence, \begin{align*} & \det\left( A+I_n \right) \\ & =\det\left( \underbrace{\left( A+I_n \right) ^T }_{=\left( A+I_n \right) A^T }\right) =\det\left( \left( A+I_n \right) A^T \right) =\det\left( A+I_n \right) \cdot\underbrace{\det\left( A^T \right) }_{=\det A}\\ & \qquad\left( \text{since }\det\left( XY\right) =\det X\cdot\det Y\text{ for any two }n\times n\text{-matrices }X\text{ and }Y\right) \\ & =\det\left( A+I_n \right) \cdot\det A. \end{align*} If the element $\det\left( A+I_n \right) $ of $R$ were invertible, then we could divide both sides of this equality by $\det\left( A+I_n \right) $ and thus obtain $1=\det A$, which would contradict $\det A\neq1$. Hence, $\det\left( A+I_n \right) $ is not invertible. This proves Theorem 1. $\blacksquare$
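Over a general ring, "not invertible" is genuinely weaker than "equal to $0$". Here is a $1\times1$ illustration over $R=\mathbb{Z}/8\mathbb{Z}$, an example of mine rather than one from the answer:

```python
from math import gcd

# Theorem 1 over R = Z/8Z with the 1x1 matrix A = [3].  Since 3*3 = 9 = 1
# (mod 8), A is orthogonal over R, and det A = 3 != 1.  Theorem 1 then says
# det(A + I) = 4 is not invertible in Z/8Z -- note that it is nonzero, so
# the conclusion really is "non-invertible", not "zero", over a mere ring.
m = 8                      # modulus: R = Z/8Z
a = 3                      # the single entry of A
assert (a * a) % m == 1    # A A^T = I, so A is orthogonal over R
assert a % m != 1          # det A != 1
d = (a + 1) % m            # det(A + I_1)
assert d != 0              # ... it is nonzero ...
assert gcd(d, m) != 1      # ... yet not invertible in Z/8Z
print(d)                   # prints 4
```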

Note that the claim of the question "Linear-algebra first course problem about orthogonal matrices" can easily be obtained by applying Theorem 1 above to $-A$ instead of $A$.
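The $-A$ trick can also be checked numerically: for real orthogonal $A$ with $n$ odd and $\det A=1$, we get $\det(-A)=(-1)^n\det A=-1\neq1$, so Theorem 1 gives $\det(I_n-A)=0$, i.e. $1$ is an eigenvalue of $A$ (Euler's rotation theorem). The random rotation below is a hypothetical test case of mine:

```python
import numpy as np

# A random 3x3 rotation (det = +1, n = 3 odd): Theorem 1 applied to -A
# predicts det(I - A) = 0, i.e. 1 is an eigenvalue (the rotation axis).
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1                      # force det Q = +1

assert np.isclose(np.linalg.det(Q), 1.0)
assert np.isclose(np.linalg.det(np.eye(3) - Q), 0.0)  # 1 is an eigenvalue
```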