Which matrices can be realized as second derivatives of orthogonal paths?


$\newcommand{\skew}{\operatorname{skew}}$ $\newcommand{\sym}{\operatorname{sym}}$ $\newcommand{\SO}{\operatorname{SO}_n}$

I would like to know which real matrices $A \in M_n$ can be realized as second derivatives of paths in $\SO$ starting at the identity. That is, for which matrices $A$ does there exist a smooth path $\alpha:(-\epsilon,\epsilon) \to \SO$ such that $\alpha(0)=Id$ and $\ddot \alpha(0)=A$? We denote the space of realizable matrices by $D$.

Question: I prove below that $ (\skew)^2 \subseteq D \subseteq (\skew)^2+\skew $. Does $D=(\skew)^2+\skew$ always hold?

Comment: Note that $(\skew)^2+\skew \subsetneq M_n$, at least for odd $n$: in that case every skew-symmetric matrix is singular, so $(\skew)^2 \subseteq \sym$ consists only of singular matrices and hence does not contain all symmetric matrices.
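The two facts used here (odd-dimensional skew-symmetric matrices are singular, and their squares are symmetric) are easy to sanity-check numerically. A minimal sketch, assuming NumPy and a randomly generated example:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 3x3 skew-symmetric matrix: B = A - A^T.
A = rng.standard_normal((3, 3))
B = A - A.T

# det(B) = det(-B^T) = (-1)^3 det(B), so det(B) = 0 for odd n.
print(abs(np.linalg.det(B)) < 1e-10)        # True: B is singular
print(np.allclose(B @ B, (B @ B).T))        # True: B^2 is symmetric
print(abs(np.linalg.det(B @ B)) < 1e-10)    # True: B^2 is singular
```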

Edit: I proved below that equality holds in dimension $n=2$.


Proof of $ (\skew)^2 \subseteq D \subseteq (\skew)^2+\skew $:

  1. Every square of a skew-symmetric matrix can be realized: for skew-symmetric $B$, take $\alpha(t)=e^{tB}$. Then $\dot \alpha(t)=Be^{tB}$ and $\ddot \alpha(t)=B^2e^{tB}$, so $\ddot\alpha(0)=B^2$.

  2. The space of realizable matrices is contained in $(\skew)^2+\skew$: Indeed, since $\dot \alpha(t) \in T_{\alpha(t)}\SO=\alpha(t)\skew$, we have $\dot\alpha(t)=\alpha(t)B(t)$ for some $B(t) \in \skew$, so

$$\ddot \alpha(t)=\dot \alpha(t) B(t)+\alpha(t) \dot B(t),$$ hence $\ddot \alpha(0)=\dot \alpha(0) B(0)+ \alpha(0)\dot B(0)= B(0)^2+\dot B(0) \in (\skew)^2 +\skew,$ where the last equality follows from $\alpha(0)=Id$ and $\dot \alpha(0)=B(0)$ (put $t=0$ in $\dot\alpha(t)=\alpha(t)B(t)$).
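The first inclusion can be probed numerically: for $\alpha(t)=e^{tB}$, a central finite difference at $t=0$ should recover $B^2$. A sketch assuming NumPy (the Taylor-series `expm` is a simple stand-in for a proper matrix exponential, adequate for small arguments):

```python
import numpy as np

def expm(M, terms=40):
    """Matrix exponential via plain Taylor series (adequate for small ||M||)."""
    out, term = np.eye(M.shape[0]), np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = (A - A.T) / 2                       # a skew-symmetric matrix

alpha = lambda t: expm(t * B)           # path in SO(n) with alpha(0) = Id

# Central finite difference for the second derivative at t = 0.
h = 1e-4
ddot = (alpha(h) - 2 * np.eye(4) + alpha(-h)) / h**2
print(np.allclose(ddot, B @ B, atol=1e-6))   # True: alpha''(0) = B^2
```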

Edit 2: When trying to show the converse inclusion, I hit a wall: we need to produce solutions of $\dot\alpha(t)=\alpha(t)B(t)$ with $\alpha(t) \in \SO$, $B(t) \in \skew$, and arbitrary prescribed $B(0),\dot B(0) \in \skew$. A naive attempt is to define $\alpha(t)=e^{\int_0^t B(s)\,ds}$ for $B(s)=B(0)+s\dot B(0)$. However, it is not true in general that $\dot\alpha(t)=\alpha(t)B(t)$; this holds when $B(t)$ and $\int_0^t B(s)\,ds$ commute for all $t$, which happens if and only if $B(0)$ and $\dot B(0)$ commute.


Proof of $D = (\skew)^2+\skew$ for $n=2$:

$\alpha(t)$ can always be written as $\alpha(t)=\begin{pmatrix} c(\phi(t)) & s(\phi(t)) \\\ -s(\phi(t)) & c(\phi(t)) \end{pmatrix}$, where $c(x)=\cos x,s(x)=\sin x$, and $\phi(t)$ is some parametrization satisfying $\phi(0)=0$.

Differentiating $\alpha(t)$ twice, we get

$$ \ddot \alpha(t)=-(\phi'(t))^2\alpha(t)+\phi''(t)\begin{pmatrix} -s(\phi(t)) & c(\phi(t)) \\\ -c(\phi(t)) & -s(\phi(t)) \end{pmatrix},$$

so

$$ \ddot \alpha(0)=-(\phi'(0))^2Id+\phi''(0)\begin{pmatrix} 0 & 1 \\\ -1 & 0 \end{pmatrix}.$$

Since we can choose $\phi'(0),\phi''(0)$ as we wish, we conclude that $$ D=\mathbb{R}_{\le 0}Id+\mathbb{R}\begin{pmatrix} 0 & 1 \\\ -1 & 0 \end{pmatrix}=\mathbb{R}_{\le 0}Id+\skew.$$ Since $\skew=\text{span} \{ \begin{pmatrix} 0 & 1 \\\ -1 & 0 \end{pmatrix}\}$, and $\begin{pmatrix} 0 & 1 \\\ -1 & 0 \end{pmatrix}^2=-Id$, we have $\skew^2=\mathbb{R}_{\le 0}Id$, so indeed $D=(\skew)^2+\skew$.
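This two-dimensional computation can be sanity-checked numerically. A sketch assuming NumPy, where the jet values $\phi'(0)=1.3$ and $\phi''(0)=-0.7$ are arbitrary illustrative choices:

```python
import numpy as np

def rot(phi):
    """The rotation matrix used in the text: [[cos, sin], [-sin, cos]]."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, s], [-s, c]])

a, b = 1.3, -0.7                        # phi'(0) and phi''(0), chosen freely
phi = lambda t: a * t + 0.5 * b * t**2  # any smooth phi with these jets works
alpha = lambda t: rot(phi(t))

# Central finite difference for the second derivative at t = 0.
h = 1e-4
ddot = (alpha(h) - 2 * np.eye(2) + alpha(-h)) / h**2

J = np.array([[0.0, 1.0], [-1.0, 0.0]])
expected = -a**2 * np.eye(2) + b * J    # -phi'(0)^2 Id + phi''(0) J
print(np.allclose(ddot, expected, atol=1e-5))   # True
```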

On BEST ANSWER

Yes. Given skew-symmetric matrices $B$ and $C,$ define $\alpha(t)=\exp(Bt+\tfrac12 Ct^2).$ Then \begin{align} \alpha(t) &=I+(Bt+\tfrac12 Ct^2)+\tfrac12 (Bt+\tfrac12 Ct^2)^2+O(t^3)\\ &=I+Bt+\tfrac12 (B^2+C)t^2+O(t^3) \end{align} as $t\to 0.$ This shows that $\ddot \alpha(0)=B^2+C.$
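This answer is easy to confirm numerically by finite differences (a sketch assuming NumPy; `expm` is a simple Taylor-series matrix exponential used for illustration). Since $Bt+\tfrac12 Ct^2$ is skew-symmetric for every $t$, the path also visibly stays in $\mathrm{SO}(n)$:

```python
import numpy as np

def expm(M, terms=40):
    """Matrix exponential via plain Taylor series (adequate for small ||M||)."""
    out, term = np.eye(M.shape[0]), np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

rng = np.random.default_rng(3)

def random_skew(n):
    A = rng.standard_normal((n, n))
    return (A - A.T) / 2

B, C = random_skew(4), random_skew(4)
alpha = lambda t: expm(B * t + 0.5 * C * t**2)

# alpha(t) is orthogonal, since Bt + Ct^2/2 is skew for every t.
g = alpha(0.5)
print(np.allclose(g.T @ g, np.eye(4), atol=1e-10))   # True

# Central finite difference recovers alpha''(0) = B^2 + C.
h = 1e-4
ddot = (alpha(h) - 2 * np.eye(4) + alpha(-h)) / h**2
print(np.allclose(ddot, B @ B + C, atol=1e-6))       # True
```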


I believe that the construction you're looking for is called a Dyson series (Wikipedia). In detail, suppose we are given $B,C \in \mathrm{Skew}(n)$ and we want to construct a $\gamma: (-\varepsilon,\varepsilon) \to \mathrm{SO}(n)$ such that $\gamma(0)=1$ and $\ddot{\gamma}(0)=B^2+C$. I claim that (writing $m$ for the summation index, since $n$ already denotes the dimension) \begin{align*} \gamma(t) & := 1 + \sum_{m=0}^{\infty} \left[\int_0^t \int_0^{t_0} \cdots \int_0^{t_{m-1}} \left(\prod_{k=0}^m (B+t_{m-k} C) \right) \mathrm{d} t_m \cdots \mathrm{d} t_0\right] \\ & = 1 + \int_0^t (B+t_0C) \mathrm{d}t_0 + \int_0^t \int_0^{t_0} (B+t_1C)(B+t_0C) \mathrm{d} t_1 \mathrm{d}t_0 + \\ & \hspace{1cm}\int_0^t \int_0^{t_0} \int_0^{t_1} (B+t_2C)(B+t_1C)(B+t_0C) \mathrm{d}t_2 \mathrm{d}t_1 \mathrm{d}t_0 + \cdots \end{align*} is a well-defined solution to the problem.

Indeed, if we let $$M:=\max_{s \in [0,t]} \lVert B+sC \rVert_{L^2},$$ then the $m$-th iterated integral has norm at most $M^{m+1}|t|^{m+1}/(m+1)!$, so $$\lVert \gamma(t) \rVert_{L^2} \leq e^{M|t|},$$ and $\gamma$ is defined by a convergent series. Moreover, we can compute that $\dot{\gamma}(t)=\gamma(t)(B+tC)$, evincing both that the image of $\gamma$ (which a priori lies in the space of $n \times n$ matrices) in fact lies in $\mathrm{SO}(n)$ and also that $\ddot{\gamma}(0)=B^2+C$.
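Rather than evaluating the iterated integrals directly, a numerical sketch can integrate the ODE $\dot{\gamma}(t)=\gamma(t)(B+tC)$ that the Dyson series solves (assuming NumPy, with a hand-rolled RK4 stepper; this checks the claims but is not the series itself):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
A1, A2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
B, C = (A1 - A1.T) / 2, (A2 - A2.T) / 2   # skew-symmetric data

def gamma(t, steps=2000):
    """Integrate gamma' = gamma (B + sC), gamma(0) = I, with classical RK4."""
    g = np.eye(n)
    h = t / steps
    f = lambda s, g: g @ (B + s * C)
    s = 0.0
    for _ in range(steps):
        k1 = f(s, g)
        k2 = f(s + h / 2, g + h / 2 * k1)
        k3 = f(s + h / 2, g + h / 2 * k2)
        k4 = f(s + h, g + h * k3)
        g = g + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        s += h
    return g

# gamma stays in SO(n): gamma(1)^T gamma(1) = I up to integration error.
g1 = gamma(1.0)
print(np.allclose(g1.T @ g1, np.eye(n), atol=1e-8))

# Central finite difference: the second derivative at 0 equals B^2 + C.
h = 1e-3
dd = (gamma(h) - 2 * np.eye(n) + gamma(-h)) / h**2
print(np.allclose(dd, B @ B + C, atol=1e-4))
```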