Is it true that every orthogonal transformation, even over $\mathbb R$, is diagonalizable? I haven't been able to find any information about this. Could anyone explain, please?
2.7k views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · There are 5 answers below.
Consider $\begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$. This matrix is not diagonalizable over the reals, since its eigenvalues are the non-real pair $\pm i$.
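If you want to see this numerically, here is a small NumPy sketch (my own illustration, not part of the original answer) confirming that the matrix is orthogonal yet has the non-real eigenvalues $\pm i$:

```python
import numpy as np

# The 90-degree rotation matrix from the answer above.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# It is orthogonal: A^T A = I.
assert np.allclose(A.T @ A, np.eye(2))

# Its eigenvalues are the non-real conjugate pair -i and +i,
# so there is no real eigenvector and no real diagonalization.
eigvals = np.linalg.eigvals(A)
assert np.allclose(np.sort_complex(eigvals), np.array([-1j, 1j]))
```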
The orthogonal transformations of $\mathbb R^2$ are rotations and reflections through a line. Do you know a nonzero vector which is invariant under a nontrivial rotation? (Answer: there is none; only $v = 0$ is fixed, so such a rotation has no real eigenvector.)
Let me go a little farther. Let $n$ be a positive integer, and let $L: \mathbb{R}^n \rightarrow \mathbb{R}^n$ be a linear transformation.
1) Then $L$ has an invariant subspace of dimension either $1$ or $2$. The former occurs if and only if $L$ has a real eigenvalue.
Proof: It is immediate that the existence of real eigenvalues is equivalent to the existence of $1$-dimensional invariant subspaces. So suppose $L$ has no real eigenvalue. Since the complex eigenvalues are roots of the real polynomial $\chi(t) = \operatorname{det}(t \cdot 1_{\mathbb{R}^n} - L)$, for a complex eigenvalue $\lambda$ (which exists by the Fundamental Theorem of Algebra) the complex conjugate $\overline{\lambda}$ is also an eigenvalue, and if $v \in \mathbb{C}^n$ is such that $Lv = \lambda v$, then taking complex conjugates and using $L = \overline{L}$ we get
$L \overline{v} = \overline{\lambda} \overline{v}$.
Then
$$V = \operatorname{span}_{\mathbb{R}} \{ \frac{v+\overline{v}}{2}, \frac{v-\overline{v}}{2i} \}$$
is a two-dimensional (real!) invariant subspace of $\mathbb{R}^n$.
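The construction can be checked numerically. The sketch below (NumPy, my own illustration rather than part of the answer) takes a rotation of $\mathbb{R}^3$ about the $z$-axis, extracts a complex eigenvector $v$, forms the real vectors $(v+\overline{v})/2$ and $(v-\overline{v})/2i$, and verifies that their span is invariant under $L$:

```python
import numpy as np

theta = 1.0
# Rotation of R^3 about the z-axis: its non-real eigenvalues are
# exp(+-i*theta), with eigenvectors lying in the complexified x-y plane.
L = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

lams, vecs = np.linalg.eig(L)
k = int(np.argmax(np.abs(lams.imag)))   # pick a genuinely non-real eigenvalue
v = vecs[:, k]

# The real vectors (v + conj v)/2 and (v - conj v)/(2i), i.e. Re v and Im v:
e1 = ((v + v.conj()) / 2).real
e2 = ((v - v.conj()) / 2j).real

# Check that V = span{e1, e2} is L-invariant: L e1 and L e2 are real
# linear combinations of e1 and e2.
B = np.column_stack([e1, e2])           # 3x2 basis matrix for V
coords, *_ = np.linalg.lstsq(B, L @ B, rcond=None)
assert np.allclose(B @ coords, L @ B)
```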
[Remark aside: if you are willing to assume the Real Fundamental Theorem of Algebra -- namely that every nonconstant real polynomial has a factor of degree one or two -- then you can establish the above fact without using complex numbers, considering factors of the local minimal polynomial ($=$ the annihilator polynomial) of any nonzero $v \in \mathbb{R}^n$. I have recently been kicking around whether it is possible to show RFTA using real linear algebra only. So far no luck.]
2) If $L$ is orthogonal (respectively, symmetric), then for any invariant subspace $V \subset \mathbb{R}^n$, the orthogonal complement $V^{\perp} = \{w \in \mathbb{R}^n \mid w \cdot v = 0 \ \forall v \in V\}$ is also an invariant subspace, and $L|_{V^{\perp}}$ remains orthogonal (respectively, symmetric).
From 1) and 2) we see that if $L$ is orthogonal (or symmetric), then $\mathbb{R}^n$ is an orthogonal direct sum of invariant subspaces of dimension $1$ or $2$. In the symmetric case it turns out that one can go further and take the subspaces to be of dimension $1$ (this is the Real Spectral Theorem). A little thought shows that this is equivalent to:
3) If $L$ is symmetric, then all the complex eigenvalues of $L$ are actually real numbers. (Or more gracefully: $\chi(t)$ splits into linear factors over $\mathbb{R}$.)
Once again there is the standard proof, which shows directly that the complex eigenvalues of a symmetric matrix must be real, and there is also a totally real proof using RFTA. (I have to run shortly, so I omit these details for now. The first proof is of course very standard: it appears in most linear algebra books.)
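As a numerical illustration of 3) and the Real Spectral Theorem (a NumPy sketch I am adding, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
S = (A + A.T) / 2                   # a random real symmetric matrix

# 3): every complex eigenvalue of a real symmetric matrix is real.
lams = np.linalg.eigvals(S)
assert np.allclose(lams.imag, 0.0)

# Real Spectral Theorem: S is orthogonally diagonalizable.
w, Q = np.linalg.eigh(S)            # w real eigenvalues, Q orthogonal
assert np.allclose(Q.T @ Q, np.eye(5))
assert np.allclose(Q @ np.diag(w) @ Q.T, S)
```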
Now let us look at the case in which $L$ is orthogonal. Then:
4) The complex eigenvalues of an orthogonal matrix all lie on the unit circle. In particular the only possible real eigenvalues are $\pm 1$.
(Again I omit the standard proof for now.)
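A quick numerical sanity check of 4), using a random orthogonal matrix obtained from a QR factorization (again a NumPy sketch of my own):

```python
import numpy as np

rng = np.random.default_rng(1)
# A random orthogonal matrix: the Q factor of a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
assert np.allclose(Q.T @ Q, np.eye(6))

# 4): every complex eigenvalue of Q has modulus 1.
lams = np.linalg.eigvals(Q)
assert np.allclose(np.abs(lams), 1.0)
```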
Thus a real orthogonal transformation is diagonalizable if and only if it is the direct sum of its $1$-eigenspace and its $-1$-eigenspace. This is a strong condition: in particular its trace must be an integer. Let's look more closely at the $n = 2$ case, since the key question is when a $2$-dimensional invariant subspace has an eigenvalue.
Since $L$ is orthogonal, $L((1,0))$ and $L((0,1))$ form an orthonormal basis for $\mathbb{R}^2$: here this means that both are unit vectors and that their dot product is zero. We can write any unit vector as $(\cos \theta, \sin \theta)$ and thus take $L((1,0))$ to be in this form, and then there are two perpendicular unit vectors: $(-\sin \theta, \cos \theta)$ and $(\sin \theta,-\cos \theta)$, leading to the two standard matrices
$R_{\theta} = \left[ \begin{array}{cc} \cos \theta & - \sin \theta \\ \sin \theta & \cos \theta \end{array} \right]$
and
$T_{\theta} = \left[ \begin{array}{cc} \cos \theta & \sin \theta \\ \sin \theta & -\cos \theta \end{array} \right]$.
In the first case the characteristic polynomial is $t^2 - (2\cos\theta)t + 1$, of discriminant $4(\cos^2 \theta - 1)$. Thus we have real eigenvalues if and only if $\cos \theta = \pm 1$; these values correspond to the scalar matrices $\pm 1$. For all other values we do not have real eigenvalues, so we see that a two-dimensional orthogonal transformation with determinant $1$ almost never has a real eigenvalue. However, because the roots are not real, the quadratic formula guarantees that there are two distinct eigenvalues $\lambda \neq \overline{\lambda} \in \mathbb{C}$, and thus $R_{\theta}$ is diagonalizable over $\mathbb{C}$.
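The first case can be checked numerically; the sketch below (NumPy, my illustration) verifies that the eigenvalues of $R_\theta$ are the conjugate pair $e^{\pm i\theta}$ and that $R_\theta$ is diagonalizable over $\mathbb{C}$:

```python
import numpy as np

theta = 2.0                         # any angle with cos(theta) != +-1
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

lams, P = np.linalg.eig(R)
# The eigenvalues are the distinct conjugate pair e^{+-i theta} ...
expected = [np.exp(-1j * theta), np.exp(1j * theta)]
assert np.allclose(sorted(lams, key=lambda z: z.imag), expected)
# ... so R is diagonalizable over C: P diag(lams) P^{-1} = R.
assert np.allclose(P @ np.diag(lams) @ np.linalg.inv(P), R)
```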
In the second case the characteristic polynomial is $t^2 - 1 = (t+1)(t-1)$, so indeed we have real eigenvalues. More specifically, $T_{\theta}$ is the matrix of orthogonal reflection through the line spanned by $(\cos(\theta/2), \sin(\theta/2))$.
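Likewise for the second case: the sketch below (NumPy, my illustration) checks that $T_\theta$ has eigenvalues $\pm 1$ and fixes the unit vector at angle $\theta/2$:

```python
import numpy as np

theta = 1.2
T = np.array([[np.cos(theta),  np.sin(theta)],
              [np.sin(theta), -np.cos(theta)]])

# Characteristic polynomial t^2 - 1, so the eigenvalues are -1 and +1.
lams = np.linalg.eigvals(T)
assert np.allclose(np.sort(lams), [-1.0, 1.0])

# The reflection fixes the line at angle theta/2.
u = np.array([np.cos(theta / 2), np.sin(theta / 2)])
assert np.allclose(T @ u, u)
```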
In summary we have proved:
Every real orthogonal linear transformation $L$ is orthogonally equivalent to a block diagonal matrix consisting of $1 \times 1$ blocks $[1]$, $1 \times 1$ blocks $[-1]$, and/or $2 \times 2$ blocks $R_{\theta}$ for $\theta \in (0, 2\pi) \setminus \{\pi\}$. $L$ is diagonalizable (equivalently, orthogonally diagonalizable) over $\mathbb{R}$ if and only if no $2 \times 2$ blocks appear. In all cases $L$ is diagonalizable over $\mathbb{C}$.
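To illustrate the block-diagonal form, one can assemble such a matrix directly and confirm that it is orthogonal, fails to diagonalize over $\mathbb{R}$ because of the rotation block, and still diagonalizes over $\mathbb{C}$ (a NumPy sketch of my own):

```python
import numpy as np

def rot(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Block diagonal orthogonal matrix with blocks [1], [-1], R_theta.
theta = 0.7
L = np.zeros((4, 4))
L[0, 0] = 1.0
L[1, 1] = -1.0
L[2:4, 2:4] = rot(theta)

assert np.allclose(L.T @ L, np.eye(4))          # L is orthogonal

# The 2x2 block contributes non-real eigenvalues, obstructing
# diagonalization over R, but L is still diagonalizable over C.
lams, P = np.linalg.eig(L)
assert np.any(np.abs(lams.imag) > 1e-8)
assert np.allclose(P @ np.diag(lams) @ np.linalg.inv(P), L)
```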
It seems to me that the above result is a close relative of the Real Spectral Theorem. Since orthogonal matrices appear at least as prominently in introductory linear algebra as symmetric matrices do, I am surprised not to find it in most of the standard texts.
No, take $\begin{bmatrix} 0 & 1\\ -1 & 0\end{bmatrix}$ or, for an $n \times n$ matrix with $n \ge 3$, take $\begin{bmatrix} 0 & 1 & 0 & \dots & 0\\ -1 & 0 & 0 & \dots & 0\\ 0 & 0 & 1 & \dots & 0\\ 0 & 0 & 0 & \ddots & \vdots\\ 0 & 0 & 0 & \dots & 1 \end{bmatrix}$.
Both of these matrices have $i$ and $-i$ among their eigenvalues and hence aren't diagonalizable over $\mathbb R$.
However any real orthogonal matrix is diagonalizable over $\mathbb C$.
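For the $3 \times 3$ example above, a short NumPy check (my illustration) of both claims:

```python
import numpy as np

# The 3x3 matrix from the answer: a rotation block plus a fixed axis.
A = np.array([[ 0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0],
              [ 0.0, 0.0, 1.0]])

lams, P = np.linalg.eig(A)
# Eigenvalues i, -i and 1: not all real, so no diagonalization over R ...
assert np.allclose(sorted(lams, key=lambda z: z.imag), [-1j, 1.0, 1j])
# ... but A is diagonalizable over C.
assert np.allclose(P @ np.diag(lams) @ np.linalg.inv(P), A)
```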