How can skew-symmetric matrices be thought of as infinitesimal rotations?


I've recently stumbled upon the fact that skew-symmetric matrices somehow represent infinitesimal rotations. Having never encountered them before, I looked them up and learnt that they have to do with Lie groups and algebras, but this is beyond what I've studied so far.

Is it possible to have a more intuitive understanding of this?

Also, from Wikipedia:

skew-symmetric matrices are derivatives, while an actual infinitesimal rotation matrix has the form $I+A\,d\theta$ where $d\theta$ is vanishingly small and $A\in\mathfrak{so}(3)$.

Having read that this is about derivatives and has applications in physics, I find that "lonely" $d\theta$ a bit suspicious. What is its role?

3 Answers

BEST ANSWER

If you have a sufficiently smooth function $F$ from a real variable $t$ to real $n\times n$ matrices, then you can differentiate each element of the matrix with respect to $t$, and therefore give meaning to $F'(t)$.

Now if we know that $F(t)$ is always a rotation matrix and that $F(t_0)=I$, then it turns out that $F'(t_0)$ will always be skew-symmetric. And conversely, every skew-symmetric matrix will be the derivative of some $F$ that satisfies these conditions.

In this way we can consider the skew-symmetric $F'(t_0)$ to encode which way and how fast the new coordinates given by $F(t)$ rotate at time $t=t_0$.

Without the assumption that $F(t_0)=I$ we can still say that $F(t)^{-1}F'(t)$ and $F'(t)F(t)^{-1}$ are always skew-symmetric; these encode the instantaneous rotation at any time in two different ways.
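This is easy to check numerically. Below is a small sketch using NumPy; the curve $F$ is an arbitrary smooth path of rotations chosen for illustration (it is not from the answer), and the derivative is approximated by a central difference.

```python
import numpy as np

def Rz(t):
    # rotation about the z-axis by angle t
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def Rx(t):
    # rotation about the x-axis by angle t
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def F(t):
    # an example smooth curve of rotation matrices (axis and angle both vary)
    return Rz(t) @ Rx(2 * t)

t0, h = 0.7, 1e-6
Fprime = (F(t0 + h) - F(t0 - h)) / (2 * h)  # central-difference derivative F'(t0)
S = Fprime @ F(t0).T                        # F'(t0) F(t0)^{-1}, since F(t0)^{-1} = F(t0)^T
print(np.allclose(S, -S.T, atol=1e-6))      # S is skew-symmetric
```

Note that $F(t_0)\ne I$ here, so it is the product $F'(t_0)F(t_0)^{-1}$, not $F'(t_0)$ itself, that comes out skew-symmetric.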

ANSWER

If $r$ is a fixed unit vector that defines the rotation axis, then the matrix for a rotation through an angle $\theta$ is the exponential of the “cross-product matrix” of $\theta r$, which is skew-symmetric. (This is similar to the complex exponential $e^{i\theta}$.) The expression in your question is the first two terms of the Taylor series of this exponential, so it’s a linear approximation to the actual rotation.

You can also arrive at this approximation directly. A general rotation in $\mathbb R^3$ can be decomposed as $$R=Q^T\begin{bmatrix}\cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0&0&1 \end{bmatrix}Q,$$ where $Q$ is some orthogonal matrix. For small angles $d\theta$, the central matrix is approximately $$\begin{bmatrix}1&-d\theta&0 \\ d\theta&1&0\\0&0&1\end{bmatrix} = I+d\theta\begin{bmatrix}0&-1&0\\1&0&0\\0&0&0\end{bmatrix}.$$ (The latter matrix is in fact the cross-product matrix of the unit $z$-vector.) Conjugation preserves skew-symmetry, so for small angles, $R\approx I+d\theta\,A$, where $A$ is some skew-symmetric matrix. The matrix $d\theta\,A$ thus computes the approximate amount by which the rotated vector is displaced in the plane of rotation.
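The exponential picture above can also be checked numerically. The sketch below (my own illustration, not from the answer) builds the cross-product matrix of the unit $z$-vector, exponentiates it with a truncated power series, and compares the result with the linear approximation $I+d\theta\,A$ for a small angle.

```python
import numpy as np

def cross_matrix(v):
    # skew-symmetric cross-product matrix: cross_matrix(v) @ w == np.cross(v, w)
    x, y, z = v
    return np.array([[0, -z, y], [z, 0, -x], [-y, x, 0]])

def expm(M, terms=30):
    # matrix exponential via truncated power series (fine for small matrices)
    result, term = np.eye(3), np.eye(3)
    for k in range(1, terms):
        term = term @ M / k
        result = result + term
    return result

axis = np.array([0.0, 0.0, 1.0])   # unit z-vector, as in the answer
A = cross_matrix(axis)             # skew-symmetric generator

theta = 1e-4                       # a small angle d_theta
R = expm(theta * A)                # the exact rotation exp(theta * A)
approx = np.eye(3) + theta * A     # the linear approximation I + d_theta * A
print(np.abs(R - approx).max())    # error is O(theta^2)
```

The printed error shrinks quadratically as `theta` decreases, which is exactly what "linear approximation to the actual rotation" means.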

ANSWER

I would like to point out how misleading the formula "infinitesimal rotation matrix" $=I+A\,d\theta$ is.

Firstly, the correct equality is $R(\theta)=R(0)+\theta A+O(\theta^2)$, where $R(0)=I$ and $A\in SK$ (the set of skew-symmetric matrices).

Secondly, one must not believe that this type of formula is valid only when the angle varies; it remains valid when the angle AND the axis vary, and more generally for $O(n,\mathbb{R})$, the group of real orthogonal $n\times n$ matrices.

Thirdly, it is simply the Taylor formula to order $O(t^2)$. We will see below that this type of formula extends easily to order $O(t^3)$.

Proposition. Let $t\in I\rightarrow R(t)\in O(n,\mathbb{R})$ be a $C^3$ function, where $I\subset \mathbb{R}$ is a neighborhood of $0$. Then there are two skew-symmetric matrices $K,L$ such that

$R(t)=(I+tK+t^2/2(K^2+L))R(0)+O(t^3).$

Proof. The beginning is very well known. $RR^T=I$ implies $R'R^T+RR'^T=0$; since $RR'^T=(R'R^T)^T$, this says that $R'R^T=K\in SK$, and finally $R'=KR$.

Second step. $R''R^T+2R'R'^T+RR''^T=0$; since $R'R'^T=KRR^TK^T=KK^T=-K^2$, this implies $R''R^T+RR''^T-2K^2=0$.

Then $R''R^T-K^2=L\in SK$ and consequently $R''=(K^2+L)R$.

We finish by applying the Taylor formula

$R(t)=R(0)+tR'(0)+t^2/2R''(0)+O(t^3)$.
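The proposition can be verified numerically. In the sketch below (my own illustration), the curve $R$ is an arbitrary $C^\infty$ path in $O(3)$ with $R(0)\neq I$; $K$ and $L$ are recovered from finite-difference derivatives as in the proof, and the second-order expansion is compared with the true $R(t)$.

```python
import numpy as np

def Rz(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def Rx(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def R(t):
    # an example smooth curve in O(3) with R(0) != I (axis and angle both vary)
    return Rz(t) @ Rx(2 * t) @ Rz(0.5)

h = 1e-4
R0 = R(0)
R1 = (R(h) - R(-h)) / (2 * h)        # R'(0) by central difference
R2 = (R(h) - 2 * R0 + R(-h)) / h**2  # R''(0) by central difference

K = R1 @ R0.T            # K = R'(0) R(0)^T, skew-symmetric
L = R2 @ R0.T - K @ K    # L = R''(0) R(0)^T - K^2, skew-symmetric
print(np.allclose(K, -K.T, atol=1e-6), np.allclose(L, -L.T, atol=1e-4))

t = 1e-2
second_order = (np.eye(3) + t * K + t**2 / 2 * (K @ K + L)) @ R0
print(np.abs(R(t) - second_order).max())  # error is O(t^3)
```

Halving `t` should cut the printed error by roughly a factor of eight, consistent with the $O(t^3)$ remainder.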

Remark. $O(n,\mathbb{R})$ is a real algebraic group that admits $SK$ as its tangent space at the identity $I$; therefore, $O(n,\mathbb{R})$ has dimension $n(n-1)/2$. Moreover, if $K\in SK$, then $t\rightarrow e^{tK}R$ is the geodesic of $O(n,\mathbb{R})$ through $R$ in the direction $KR$.