What are matrices that commute with a rotation matrix called?


Let's say I have the rotation matrix $R$ and the matrix $M$, where

\begin{equation} R = \begin{bmatrix} \cos(\theta) & -\sin(\theta) & 0\\ \sin(\theta) & \cos(\theta) & 0 \\ 0 & 0 & 1\end{bmatrix} \end{equation}

\begin{equation} M = \begin{bmatrix} M_1 & 0 & 0\\ 0 & M_1 & 0 \\ 0 & 0 & M_2\end{bmatrix} \end{equation}

Whether I pre-multiply or post-multiply $R$ by $M$, I get the same matrix:

\begin{equation} M R = RM = \begin{bmatrix} M_1 \cos(\theta) & -M_1 \sin(\theta) & 0\\ M_1 \sin(\theta) & M_1 \cos(\theta) & 0 \\ 0 & 0 & M_2\end{bmatrix} \end{equation}

So the products are equal when the matrix $M$ is diagonal and $M_{11} = M_{22}$. What is this type of matrix called, and are there other matrices that commute with rotation matrices?
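(A quick numerical sanity check of the computation above, sketched in NumPy with arbitrarily chosen values for $\theta$, $M_1$, and $M_2$:)

```python
import numpy as np

theta, M1, M2 = 0.7, 2.0, 5.0  # arbitrary sample values
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])
M = np.diag([M1, M1, M2])

# R and M commute: pre- and post-multiplication give the same matrix
assert np.allclose(R @ M, M @ R)
```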

Thank you


BEST ANSWER

$M$ can be written as a direct sum: $M_1$ times the $2 \times 2$ identity matrix in the upper-left block and $M_2$ times the $1 \times 1$ identity matrix in the lower-right block. $R$ can also be written as a direct sum with the same block dimensions. If each pair of corresponding blocks commutes, then the full matrices commute. Here the upper-left block of $M$ is a multiple of the identity, and the identity is the special case of a rotation with $\theta = 0$. Thus the fact that $RM = MR$ is a special case of the fact that rotation matrices about the same axis commute with each other[1], combined with the fact that scaling doesn't affect commutation (or, even more generally, that the identity matrix commutes with everything).

In higher dimensions, we can take direct sums of several rotation matrices, and they commute as long as the planes of rotation are disjoint. For instance,

\begin{equation} R_1 = \begin{bmatrix} \cos(\theta_1) & -\sin(\theta_1) & 0 & 0\\ \sin(\theta_1) & \cos(\theta_1) & 0 & 0\\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\end{bmatrix} \end{equation}

and

\begin{equation} R_2 = \begin{bmatrix} 1 & 0 & 0 & 0\\ 0 & 1 & 0 & 0\\ 0 & 0 & \cos(\theta_2) & -\sin(\theta_2) \\ 0 & 0 & \sin(\theta_2) & \cos(\theta_2) \end{bmatrix} \end{equation}

commute because the two rotations don't interact with each other.
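This block-diagonal commutation is easy to verify numerically. A minimal NumPy sketch (the helper `rot4` is just an illustrative name, not from the answer):

```python
import numpy as np

def rot4(theta, plane):
    """4x4 rotation by theta in the coordinate plane spanned by axes (i, j)."""
    i, j = plane
    R = np.eye(4)
    c, s = np.cos(theta), np.sin(theta)
    R[i, i] = c
    R[j, j] = c
    R[i, j] = -s
    R[j, i] = s
    return R

R1 = rot4(0.5, (0, 1))  # rotation in the x-y plane, like R_1 above
R2 = rot4(1.2, (2, 3))  # rotation in the z-w plane, like R_2 above

# Disjoint rotation planes => the rotations commute
assert np.allclose(R1 @ R2, R2 @ R1)
```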

[1] "Commutative" is a property of an operation or a space as a whole. When describing two matrices, the phrase is that they "commute with each other".


Answer for 3D rotations

In 3 dimensions, a rotation commutes with another rotation about the same axis, and with a "stretch" along the rotation axis.

To write this formally, parameterize 3D rotations by a unit axis vector $\hat k$ and an angle $\theta$, and call $\vec k = \theta \hat k$ the rotation vector (its length is $\theta$). The rotation matrix $R$ can be written as $R=e^{[\vec k]_\times}$, where $[\vec k]_\times$ is the antisymmetric "cross-product" matrix

$[\vec k]_\times = \begin{bmatrix} 0 &-k_z&k_y\\ k_z&0&-k_x\\-k_y&k_x&0 \end{bmatrix} $

From this it is clear that $e^{[\theta_1 \hat k]_\times}$ commutes with $e^{[\theta_2 \hat k]_\times}$ for any $\theta_{1,2}$, since both are exponentials of multiples of the same matrix.
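As a sanity check, this can be verified numerically using the closed form of $e^{[\theta \hat k]_\times}$ given by Rodrigues' rotation formula (a sketch; `cross_matrix` and `rodrigues` are illustrative helper names):

```python
import numpy as np

def cross_matrix(v):
    """Antisymmetric matrix [v]_x, so that cross_matrix(v) @ u == np.cross(v, u)."""
    x, y, z = v
    return np.array([[0.0, -z,  y],
                     [z,  0.0, -x],
                     [-y,  x, 0.0]])

def rodrigues(theta, k_hat):
    """Closed form of e^{[theta*k_hat]_x}: I + sin(t) K + (1-cos(t)) K^2."""
    K = cross_matrix(k_hat)
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

k_hat = np.array([1.0, 2.0, 2.0]) / 3.0  # unit axis
R1 = rodrigues(0.4, k_hat)
R2 = rodrigues(1.1, k_hat)

# Rotations about the same axis commute, and their angles add
assert np.allclose(R1 @ R2, R2 @ R1)
assert np.allclose(R1 @ R2, rodrigues(1.5, k_hat))
```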

Now consider matrices of the form $S=aI+b\hat k \hat k^T$. It is not hard to see that they commute with $R$, and the interpretation as a stretch becomes clear if you write $S=a(I-\hat k \hat k^T) + (a+b)\hat k \hat k^T$: the first term scales directions orthogonal to $\hat k$ by $a$, and the second term scales vectors parallel to $\hat k$ by $a+b$.

To summarize: in 3D, matrices of the general form $e^{[\tau \hat k]_\times}(aI+b\hat k \hat k^T)$ commute with $R$.
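A numerical check of this summary claim (again a sketch; Rodrigues' formula stands in for the matrix exponential, and all parameter values are arbitrary):

```python
import numpy as np

def rodrigues(theta, k_hat):
    """Rotation e^{[theta*k_hat]_x} via Rodrigues' formula."""
    kx, ky, kz = k_hat
    K = np.array([[0.0, -kz,  ky],
                  [kz,  0.0, -kx],
                  [-ky,  kx, 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

k_hat = np.array([2.0, -1.0, 2.0]) / 3.0  # unit axis
R = rodrigues(0.8, k_hat)                 # the rotation in question
a, b, tau = 1.5, -0.5, 0.3                # arbitrary parameters

# General commuting matrix: rotation about k_hat times a stretch along k_hat
G = rodrigues(tau, k_hat) @ (a * np.eye(3) + b * np.outer(k_hat, k_hat))
assert np.allclose(R @ G, G @ R)
```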