How to differentiate a skew-symmetric matrix $[\mathbf{v}]_{\times}$ with respect to $\mathbf{v}$


Can anyone explain how to differentiate a skew-symmetric matrix $[\mathbf{v}]_{\times}$ with respect to $\mathbf{v}$, i.e. $\frac{\partial{[\mathbf{v}]_{\times}}}{\partial \mathbf{v}}$, where $\mathbf{v}\in\mathbb{R}^3$?

In addition, how to derive $\frac{\partial [\mathbf{v}]^2_{\times}}{\partial \mathbf{v}}$?

Thank you!


BEST ANSWER

Using the Levi-Civita (aka Permutation) tensor $\varepsilon$ the problem has a straightforward approach $$\eqalign{ [v]_\times &= -\varepsilon\cdot v \cr\cr d[v]_\times &= -\varepsilon\cdot dv \cr\cr \frac{\partial [v]_\times}{\partial v} &= -\varepsilon \cr \cr }$$ This is the same as what you derived, but the notation is more standard.
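This contraction is easy to spot-check numerically. A minimal NumPy sketch, assuming the sign convention $([v]_\times)_{ij} = -\varepsilon_{ijk}v_k$ used above:

```python
import numpy as np

# Levi-Civita tensor: eps[i, j, k] = ε_ijk
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

def skew(v):
    """Cross-product matrix [v]_x."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

v = np.array([0.3, -1.2, 2.5])
# [v]_x = -eps . v, so d[v]_x/dv = -eps (a constant 3rd-order tensor)
assert np.allclose(skew(v), -np.einsum('ijk,k->ij', eps, v))
```

Since $[v]_\times$ is linear in $v$, the derivative is just the constant tensor $-\varepsilon$, independent of $v$.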


Update
Regarding the second part of your question: your initial expansion is incorrect, since $$\eqalign{ [v]^2_\times &= vv^T - (v^Tv)\,I \,\,\,\neq vv^T \cr\cr }$$ Update #2
Here's a partial result for the question that you linked in your comment. Given the vector $w$ and the cross-matrix $$W=[w]_\times$$ you generate a rotation matrix $$R=\exp W$$ which you'd like to differentiate.

First, define a scalar $\theta$ representing the length of the vector $w$ and take its differential $$\eqalign{ \theta^2 &= w\cdot w = \frac{1}{2}W:W \cr 2\theta\,d\theta &= W:dW \cr d\theta &= \frac{W:dW}{2\theta} \cr \cr }$$ Next expand $R$ via Rodrigues' formula $$\eqalign{ R &= I + \frac{\sin\theta}{\theta}W + \frac{1-\cos\theta}{\theta^2}W^2 \cr &= I + \alpha W + \beta W^2 \cr \cr }$$ Now let me denote the $4^{th}$ order isotropic tensor by ${\mathcal A}$ with components $${\mathcal A}_{ijkl}=\delta_{ik}\,\delta_{jl}$$ and the dyadic ($\star$) product $C = A\star B$ with components $$C_{ijkl} = A_{ij}\,B_{kl}$$ and a colon to denote the double-dot product, i.e. $$(A:B)_{ijmn} = A_{ijkl}\,B_{klmn}$$
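Rodrigues' formula can be checked against a truncated power series for the matrix exponential. A sketch (the series converges quickly for moderate $\theta$, so 20 terms are plenty):

```python
import math
import numpy as np

def skew(w):
    """Cross-product matrix [w]_x."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

w = np.array([0.2, -0.4, 0.7])
W = skew(w)
theta = np.linalg.norm(w)

# R = I + (sinθ/θ) W + ((1-cosθ)/θ²) W²   (Rodrigues)
R_rod = (np.eye(3)
         + np.sin(theta) / theta * W
         + (1 - np.cos(theta)) / theta**2 * W @ W)

# exp(W) ≈ Σ_{k=0}^{19} W^k / k!   (truncated power series)
R_exp = sum(np.linalg.matrix_power(W, k) / math.factorial(k) for k in range(20))

assert np.allclose(R_rod, R_exp)
assert np.allclose(R_rod @ R_rod.T, np.eye(3))  # R is orthogonal
```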


Finally we're ready to differentiate the rotation matrix $$\eqalign{ dR &= \alpha dW + \beta(W\cdot dW + dW\cdot W) + W\alpha^\prime d\theta + W^2\beta^\prime d\theta \cr &= \Big[\alpha{\mathcal A} + \beta(W\cdot{\mathcal A} + {\mathcal A}\cdot W^T) + \frac{\alpha^\prime}{2\theta}\,W\star W + \frac{\beta^\prime}{2\theta}\,W^2\star W\Big]:dW \cr &= -\Big[\alpha{\mathcal A} + \beta(W\cdot{\mathcal A} + {\mathcal A}\cdot W^T) + \Big(\frac{\alpha^\prime W}{2\theta}+\frac{\beta^\prime W^2}{2\theta}\Big)\star W\Big]:\varepsilon\cdot dw \cr\cr \frac{\partial R}{\partial w} &= -\Big[\alpha{\mathcal A} + \beta(W\cdot{\mathcal A} + {\mathcal A}\cdot W^T) + \Big(\frac{\alpha^\prime W}{2\theta}+\frac{\beta^\prime W^2}{2\theta}\Big)\star W\Big]:\varepsilon \cr }$$where $$\alpha^\prime=\frac{d\alpha}{d\theta},\,\,\,\,\beta^\prime=\frac{d\beta}{d\theta}$$
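The closed-form derivative above can be validated against central finite differences. A sketch in NumPy (the `einsum` index patterns encode ${\mathcal A}$, $W\cdot{\mathcal A}$, ${\mathcal A}\cdot W^T$ and the $\star$ product exactly as defined above; variable names are mine):

```python
import numpy as np

def skew(w):
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def rodrigues(w):
    W = skew(w)
    t = np.linalg.norm(w)
    return np.eye(3) + np.sin(t) / t * W + (1 - np.cos(t)) / t**2 * W @ W

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

w = np.array([0.3, -0.5, 0.8])
W = skew(w)
t = np.linalg.norm(w)
a = np.sin(t) / t                                   # α
b = (1 - np.cos(t)) / t**2                          # β
ap = (t * np.cos(t) - np.sin(t)) / t**2             # α' = dα/dθ
bp = (t * np.sin(t) - 2 * (1 - np.cos(t))) / t**3   # β' = dβ/dθ

I3 = np.eye(3)
A = np.einsum('ik,jl->ijkl', I3, I3)    # A_ijkl = δ_ik δ_jl
WA = np.einsum('ik,jl->ijkl', W, I3)    # (W·A)_ijkl = W_ik δ_jl
AWt = np.einsum('ik,lj->ijkl', I3, W)   # (A·Wᵀ)_ijkl = δ_ik W_lj
star = np.einsum('ij,kl->ijkl', (ap * W + bp * W @ W) / (2 * t), W)
T = a * A + b * (WA + AWt) + star

# ∂R_ij/∂w_m = -T_ijkl ε_klm
dRdw = -np.einsum('ijkl,klm->ijm', T, eps)

# central finite differences in each component of w
h = 1e-6
fd = np.stack([(rodrigues(w + h * I3[m]) - rodrigues(w - h * I3[m])) / (2 * h)
               for m in range(3)], axis=-1)
assert np.allclose(dRdw, fd, atol=1e-7)
```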


You can rearrange this result to better suit your tastes if you keep in mind a few things.

${\mathcal A}$ is the identity for the double-dot product $${\mathcal A}:X = X:{\mathcal A} = X$$ and $$W:\varepsilon = \varepsilon:W = -2\,w$$


Update #3
Looking more closely at your linked question, you appear to be interested in functions of the $W$ matrix of the form $$\eqalign{ F &= I + \alpha W + \beta W^2 \cr \alpha &= \alpha(\theta),\,\,\,\beta = \beta(\theta) \cr }$$ The case $F=e^W$ was analyzed above. The good news is that all of that analysis carries over to the case $F=\frac{e^W-I}{W}\,\,$ just using different functions for the scalar coefficients $$\eqalign{ \alpha &= \frac{1-\cos(\theta)}{\theta^2} \cr \beta &= \frac{\theta-\sin(\theta)}{\theta^3} \cr }$$ Therefore
$$\eqalign{ \frac{\partial F}{\partial w} &= -\Big[\alpha{\mathcal A} + \beta(W\cdot{\mathcal A} + {\mathcal A}\cdot W^T) + \Big(\frac{\alpha^\prime W}{2\theta}+\frac{\beta^\prime W^2}{2\theta}\Big)\star W\Big]:\varepsilon \cr \cr &= \Big(\frac{\alpha^\prime W}{\theta}+\frac{\beta^\prime W^2}{\theta}\Big)\star w - \alpha\,\varepsilon - \beta\,W\cdot\varepsilon - (\beta{\mathcal A}\cdot W^T):\varepsilon \cr }$$
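The new scalar coefficients can be sanity-checked via the series $\frac{e^W-I}{W} = \sum_{k\ge 0} \frac{W^k}{(k+1)!}$. A sketch:

```python
import math
import numpy as np

def skew(w):
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

w = np.array([0.2, -0.4, 0.7])
W = skew(w)
t = np.linalg.norm(w)

# F = I + α W + β W² with α = (1-cosθ)/θ², β = (θ-sinθ)/θ³
F_closed = (np.eye(3)
            + (1 - np.cos(t)) / t**2 * W
            + (t - np.sin(t)) / t**3 * W @ W)

# (e^W - I)/W = Σ_{k≥0} W^k / (k+1)!
F_series = sum(np.linalg.matrix_power(W, k) / math.factorial(k + 1)
               for k in range(20))

assert np.allclose(F_closed, F_series)
```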

Here is a derivation in plain linear-algebra notation: $$ \frac{\partial [\mathbf{v}]_\times}{\partial \mathbf{v}} = \frac{\partial \begin{bmatrix} 0 & -v_3 & v_2\\ v_3 & 0 & -v_1 \\ -v_2 & v_1 & 0 \end{bmatrix}}{\partial \begin{bmatrix}v_1\\v_2\\v_3\end{bmatrix}} $$ Using $v_1$ as an example: $$ \frac{\partial [\mathbf{v}]_\times}{\partial v_1} = \begin{bmatrix} 0 & 0 & 0\\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{bmatrix} = [e_1]_\times $$ where $e_1$ is the first column of the identity matrix $\mathbf{I} \in \mathbb{R}^{3\times3}$. Then we have: $$ \frac{\partial [\mathbf{v}]_\times}{\partial \mathbf{v}} = \begin{bmatrix} [e_1]_\times\\ [e_2]_\times\\ [e_3]_\times \end{bmatrix} $$
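Because $[\mathbf{v}]_\times$ is linear in $\mathbf{v}$, these per-component derivatives are exact, which a quick NumPy check confirms (a sketch):

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

I3 = np.eye(3)
v = np.array([0.7, -1.1, 2.0])

# linearity: [v]_x = Σ_i v_i [e_i]_x, hence ∂[v]_x/∂v_i = [e_i]_x
assert np.allclose(skew(v), sum(v[i] * skew(I3[i]) for i in range(3)))
```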

Since we have $[\mathbf{v}]_\times^2=\mathbf{v}\mathbf{v}^\top-(\mathbf{v}^\top\mathbf{v})\,\mathbf{I}$, we can derive (thanks to greg's help, I made the following correction): $$ \frac{\partial [\mathbf{v}]_\times^2}{\partial v_i} = \frac{\partial \left(\mathbf{v}\mathbf{v}^\top-(\mathbf{v}^\top\mathbf{v})\,\mathbf{I}\right)}{\partial v_i} = \frac{\partial \mathbf{v}}{\partial v_i}\mathbf{v}^\top + \mathbf{v} \left[\frac{\partial \mathbf{v}}{\partial v_i}\right]^\top-\frac{\partial (\mathbf{v}^\top\mathbf{v})}{\partial v_i}\mathbf{I}=e_i\mathbf{v}^\top+\mathbf{v}e_i^\top-2v_i\mathbf{I} $$ where $e_i$ is the $i$-th column of $\mathbf{I}$. Correct me if anyone sees a problem with this.
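Both the corrected identity and the resulting derivative pass a numerical check. A sketch, using central finite differences (exact up to roundoff here, since $[\mathbf{v}]_\times^2$ is quadratic in $\mathbf{v}$):

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

v = np.array([1.0, -2.0, 0.5])
I3 = np.eye(3)
W = skew(v)

# identity: [v]_x^2 = v v^T - (v^T v) I
assert np.allclose(W @ W, np.outer(v, v) - v.dot(v) * I3)

# derivative: ∂[v]_x^2/∂v_i = e_i v^T + v e_i^T - 2 v_i I
h = 1e-6
for i in range(3):
    analytic = np.outer(I3[i], v) + np.outer(v, I3[i]) - 2.0 * v[i] * I3
    Wp = skew(v + h * I3[i])
    Wm = skew(v - h * I3[i])
    fd = (Wp @ Wp - Wm @ Wm) / (2.0 * h)
    assert np.allclose(analytic, fd, atol=1e-8)
```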