Derivative of orthogonal matrix - Generalization of Frenet frame equations


I was studying Differential Geometry and the Frenet Frame equations

$\begin{pmatrix} T'\\ N'\\ B' \end{pmatrix} = \begin{pmatrix} 0 & k & 0\\ -k & 0 & \tau\\ 0 & -\tau & 0 \end{pmatrix} \begin{pmatrix} T\\ N\\ B \end{pmatrix}$
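As a concrete sanity check (a sketch, not part of the question), the Frenet equations above can be verified numerically for a helix $r(t) = (\cos t, \sin t, bt)$, which has constant curvature $k = 1/(1+b^2)$ and torsion $\tau = b/(1+b^2)$; the helper `frame` below and the choice $b = 0.5$ are my own for illustration:

```python
import numpy as np

b = 0.5
c = np.sqrt(1 + b**2)        # |r'(t)|, constant, so arc length s = c*t

def frame(t):
    # rows T, N, B of the Frenet frame of the helix at parameter t
    T = np.array([-np.sin(t), np.cos(t), b]) / c
    N = np.array([-np.cos(t), -np.sin(t), 0.0])
    B = np.cross(T, N)
    return np.vstack([T, N, B])

t, h = 1.3, 1e-6
# derivative with respect to arc length: d/ds = (1/c) d/dt (central difference)
dF = (frame(t + h) - frame(t - h)) / (2 * h) / c

k, tau = 1 / (1 + b**2), b / (1 + b**2)
W = np.array([[0, k, 0], [-k, 0, tau], [0, -tau, 0]])

# the Frenet equations: (T', N', B')^t = W (T, N, B)^t
assert np.allclose(dF, W @ frame(t), atol=1e-5)
print("ok")
```

Note that $W$ is exactly the antisymmetric matrix in the displayed equation, which is what motivated the general question.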

made me think about the general case:

Statement:

If $A(s)$, $s \in I$, is an orthogonal $n \times n$ matrix with $C^{\infty}$ entries, then there exists an antisymmetric matrix $R(s)$ such that $A'(s) = R(s)A(s)$ for all $s \in I$.

Proof:

$A$ orthogonal $\Leftrightarrow$ $AA^t=I_n$

Differentiating this equation, the Leibniz rule gives $A'A^t + A(A^t)' = \mathbf{0}$, so $A'A^t = -A(A^t)'$; multiplying on the right by $A$ and using $A^tA = I_n$ yields $A' = (-A(A^t)')A$.

We set $R := -A(A^t)'$ and show that it is antisymmetric:

$R^t = [-A(A^t)']^t = [-A(A')^t]^t = -A'A^t$ (using $(A^t)' = (A')^t$), hence

$R+R^t = -(A(A^t)'+A'A^t) = -(AA^t)'=-(I_n)'=\mathbf{0} \space \square$
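The statement can also be checked numerically for a concrete orthogonal family; this is a minimal sketch (the names `A`, `dA` and the choice of the $2 \times 2$ rotation family are my own) verifying both that $R = -A(A^t)'$ is antisymmetric and that $A' = RA$:

```python
import numpy as np

def A(s):
    # A(s) = rotation by angle s, an orthogonal family of matrices
    return np.array([[np.cos(s), -np.sin(s)],
                     [np.sin(s),  np.cos(s)]])

def dA(s, h=1e-6):
    # central finite difference approximating A'(s)
    return (A(s + h) - A(s - h)) / (2 * h)

s = 0.7
R = -A(s) @ dA(s).T          # R = -A (A^t)', since (A^t)' = (A')^t

# R is antisymmetric ...
assert np.allclose(R + R.T, 0, atol=1e-6)
# ... and A' = R A
assert np.allclose(dA(s), R @ A(s), atol=1e-6)
print("ok")
```

For this family $R$ comes out constant, $R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, matching the fact that rotation at unit speed has constant angular velocity.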

Is this proof complete? Have I missed anything? Something bugs me about how $R$ is defined: is it okay that $A'$ appears in its definition? Since this is an existence statement, I believe we don't need an explicit formula for $R$ that avoids $A'$ — we only have to show that such an $R$ exists — but I'm not entirely sure.

Thank you in advance!