Are orthonormal matrices rotations?


If we take an orthogonal matrix in $\mathbb{R}^{2\times 2}$, we know it has to be of the form

$$A =\begin{pmatrix} a & b\\ -b & a\end{pmatrix}$$ such that $$a^2+b^2=1$$

(or one of the columns could be multiplied by $-1$, but this makes no essential difference for the following). Since it has these restrictions we can define $\vartheta$ such that $a=\cos\vartheta$, $b=\sin\vartheta$, and we see that $A$ is a rotation matrix. If the multiplication by $-1$ is applied, the determinant becomes $-1$ and $A$ is a reflection rather than a rotation.
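As a quick numerical sanity check (a sketch using NumPy, which is assumed available; the angle is an arbitrary choice), the parametrized matrix is indeed orthogonal with determinant $1$:

```python
import numpy as np

theta = 0.7  # arbitrary angle
a, b = np.cos(theta), np.sin(theta)

# A has the form [[a, b], [-b, a]] with a^2 + b^2 = 1
A = np.array([[a,  b],
              [-b, a]])

# Orthogonality: A A^T = I, and det(A) = 1, so A is a rotation.
print(np.allclose(A @ A.T, np.eye(2)))    # True
print(np.isclose(np.linalg.det(A), 1.0))  # True
```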

I was wondering whether this still holds in higher dimensions, i.e. whether an orthogonal matrix in $\mathbb{R}^{n\times n}$ can be written as $$A = \sum_{i=1}^{d\leq n}R_i,$$ where the $R_i$ are rotations about some axis. I am not necessarily interested in the decomposition itself, only in whether something is known about this and where I could read up on it. Intuitively I would say that such a decomposition does not exist, or if it does, that it will not be of the form suggested above, but I cannot settle the question myself.

Accepted answer:

In the $n$-dimensional case it can be shown that such an orthogonal matrix $\boldsymbol A$ is orthogonally similar to a block-diagonal matrix $$ \begin{bmatrix} \boldsymbol R_1 & & & & & \\ & \boldsymbol R_2 &&&&\\ && \boldsymbol R_3 &&& \\ &&& \ddots &&\\ &&&&\boldsymbol R_k &\\ &&&&& \boldsymbol I_{n-2k} \end{bmatrix} $$ when $\det(\boldsymbol A) =1$, or $$\begin{bmatrix} \boldsymbol R_1 & & & & & \\ & \boldsymbol R_2 &&&&\\ && \ddots &&& \\ &&&\boldsymbol R_k &&\\ &&&& \boldsymbol I_{n-2k-1} & \\ &&&&& -1 \end{bmatrix} $$ when $\det(\boldsymbol A) =-1$. Here $$ \boldsymbol R_j = \begin{bmatrix} \cos(\varphi_j) & -\sin (\varphi_j)\\ \sin(\varphi_j) & \cos(\varphi_j) \end{bmatrix} \quad [j = 1, \ldots, k], $$ and $\boldsymbol I_m$ is the $m \times m$ identity matrix.

Hence such a decomposition exists, although as a block-diagonal form in a suitable orthonormal basis rather than as the sum proposed in the question.
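This structure can be illustrated numerically (a sketch with NumPy; the dimension $n=5$, the two rotation angles, and the random seed are arbitrary choices). We assemble the block-diagonal form, conjugate it by a random orthogonal change of basis $\boldsymbol Q$, and check that the result is again orthogonal with determinant $1$:

```python
import numpy as np

def rot(phi):
    """A 2x2 plane-rotation block R_j."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, -s], [s, c]])

n = 5
# Block-diagonal canonical form: two rotation blocks and a trailing I_{n-2k} (k = 2).
D = np.eye(n)
D[0:2, 0:2] = rot(0.3)
D[2:4, 2:4] = rot(1.1)

# A random orthogonal change of basis Q (QR factorization of a random matrix).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

A = Q @ D @ Q.T  # orthogonally similar to the block-diagonal form

print(np.allclose(A @ A.T, np.eye(n)))    # True: A is orthogonal
print(np.isclose(np.linalg.det(A), 1.0))  # True: det(A) = det(D) = 1
```

Note that $\det(\boldsymbol A) = \det(\boldsymbol D) = 1$ regardless of the sign of $\det(\boldsymbol Q)$, since $\boldsymbol Q$ appears together with $\boldsymbol Q^{\mathsf T}$.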

Reference: Sergei Treil, "Linear Algebra Done Wrong" (available online).

Second answer:

Since an orthogonal matrix is normal, it is diagonalizable over $\mathbb C$. Since it is unitary, its eigenvalues have magnitude $1$. Since its characteristic polynomial is real, its eigenvalues come in complex conjugate pairs. If you order the eigenvalues such that the pairs are consecutive, the diagonal blocks

$$ \pmatrix{\mathrm e^{\mathrm i\phi}&0\\0&\mathrm e^{-\mathrm i\phi}} $$

can be transformed to

$$ \pmatrix{\cos\phi&-\sin\phi\\\sin\phi&\cos\phi}\;. $$

Thus, an orthogonal transformation can be written as the product (not sum) of reflections and rotations in planes. In three dimensions, specifying a plane of rotation and a rotation axis is equivalent, but only the specification by a plane generalizes to higher dimensions.
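The change of basis between the complex diagonal block and the real rotation block can be checked numerically (a sketch with NumPy; the angle is an arbitrary choice). The columns of the unitary matrix $U$ below are the conjugate eigenvectors $(1, -\mathrm i)/\sqrt 2$ and $(1, \mathrm i)/\sqrt 2$ of the rotation block:

```python
import numpy as np

phi = 0.9  # arbitrary angle

# Unitary change of basis: columns are the eigenvectors (1, -i)/sqrt(2), (1, i)/sqrt(2).
U = np.array([[1,   1],
              [-1j, 1j]]) / np.sqrt(2)

D = np.diag([np.exp(1j * phi), np.exp(-1j * phi)])  # diag(e^{i phi}, e^{-i phi})
R = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])

print(np.allclose(U @ U.conj().T, np.eye(2)))  # True: U is unitary
print(np.allclose(U @ D @ U.conj().T, R))      # True: U D U^* is the real rotation block
```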

An eigenvector with eigenvalue $1$ is invariant under the transformation; an eigenvector with eigenvalue $-1$ is reflected by the transformation; and each pair of eigenvectors with complex conjugate eigenvalues spans a plane of rotation.
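These eigenvector statements can be seen concretely for a rotation about the $z$-axis in three dimensions (a sketch with NumPy; the angle is an arbitrary choice): the axis is the eigenvector with eigenvalue $1$, and the conjugate pair $e^{\pm\mathrm i\phi}$ corresponds to the invariant $xy$-plane of rotation.

```python
import numpy as np

phi = 0.5  # arbitrary angle

# Rotation about the z-axis; the plane of rotation is the xy-plane.
R = np.array([[np.cos(phi), -np.sin(phi), 0],
              [np.sin(phi),  np.cos(phi), 0],
              [0,            0,           1]])

e3 = np.array([0.0, 0.0, 1.0])
print(np.allclose(R @ e3, e3))  # True: eigenvalue 1, the axis is invariant

e1 = np.array([1.0, 0.0, 0.0])
print(np.isclose((R @ e1)[2], 0.0))  # True: the xy-plane maps into itself

eigvals = np.linalg.eigvals(R)
print(np.allclose(np.abs(eigvals), 1.0))  # True: all eigenvalues lie on the unit circle
```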