Can matrices do cross product with vectors?


This is a practical problem that arose in my chemistry research; apologies if I use incorrect terminology anywhere.

I'm dealing with the derivative of a vector with respect to another vector. After some searching, I found that the rule for differentiating a column vector with respect to a row vector is (all vectors here are 3-dimensional Cartesian):

$$ \mathbf{e}=\begin{bmatrix}e_1\\e_2\\e_3\end{bmatrix} $$ $$ \mathbf{P}^T=\begin{bmatrix}P_1 & P_2 & P_3 \end{bmatrix} $$ $$ \frac{\partial \mathbf{e}}{\partial \mathbf{P}^T}= \begin{bmatrix} \frac{\partial e_1}{\partial P_1} & \frac{\partial e_1}{\partial P_2} & \frac{\partial e_1}{\partial P_3} \\ \frac{\partial e_2}{\partial P_1} & \frac{\partial e_2}{\partial P_2} & \frac{\partial e_2}{\partial P_3} \\ \frac{\partial e_3}{\partial P_1} & \frac{\partial e_3}{\partial P_2} & \frac{\partial e_3}{\partial P_3} \\ \end{bmatrix} $$
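To make this layout concrete, here is a small numerical check (a sketch using NumPy and central finite differences; the function `e(P)` is a made-up example, not from the question):

```python
import numpy as np

# Hypothetical example e(P) = (P1*P2, P2*P3, P3*P1), chosen only to illustrate the layout.
def e(P):
    return np.array([P[0]*P[1], P[1]*P[2], P[2]*P[0]])

def jacobian_fd(f, P, h=1e-6):
    """Numerical Jacobian J[i, j] = d f_i / d P_j: rows follow e, columns follow P^T."""
    J = np.zeros((3, 3))
    for j in range(3):
        dP = np.zeros(3); dP[j] = h
        J[:, j] = (f(P + dP) - f(P - dP)) / (2*h)
    return J

P = np.array([1.0, 2.0, 3.0])
# Analytic Jacobian of the example above, laid out as in the formula.
J_exact = np.array([[P[1], P[0], 0.0],
                    [0.0,  P[2], P[1]],
                    [P[2], 0.0,  P[0]]])
assert np.allclose(jacobian_fd(e, P), J_exact, atol=1e-5)
```

Row $i$, column $j$ of the result is $\partial e_i/\partial P_j$, matching the matrix written above.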

Here my $\mathbf{e}$ is determined from a cross product, that is, $\mathbf{e}=\mathbf{n}\times\mathbf{m}$. So I tried to do this: $$ \frac{\partial \mathbf{n}\times \mathbf{m}}{\partial \mathbf{P}^T} $$

According to Wikipedia, the product rule can be applied, so I arrived at: $$ \frac{\partial \mathbf{n}\times \mathbf{m}}{\partial \mathbf{P}^T}=\frac{\partial \mathbf{n}}{\partial \mathbf{P}^T}\times\mathbf{m}+\mathbf{n}\times\frac{\partial \mathbf{m}}{\partial \mathbf{P}^T} $$ Now I'm confused, because $\frac{\partial \mathbf{n}}{\partial \mathbf{P}^T}$ and $\frac{\partial \mathbf{m}}{\partial \mathbf{P}^T}$ are matrices, and here they appear in cross products with vectors.

I couldn't find anything about taking the cross product of a matrix, so I'm concerned that something is wrong in the derivation.

I hope you can offer some help. Thanks in advance!


BEST ANSWER

The writing is a bit awkward. If you take the derivative with respect to a single component $P_i$, then $$ \frac{\partial n\times m}{\partial P_i} = \frac{\partial n}{\partial P_i}\times m + n\times \frac{\partial m}{\partial P_i} $$ makes more sense. If you want to give a meaning to $\frac{\partial n}{\partial P}\times m$, then the cross product should act column-wise, i.e. you take each column vector of $\frac{\partial n}{\partial P}$ and replace it by its cross product with $m$ (which is equivalent to the formula above).
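The column-wise interpretation can be checked numerically. This is a sketch with NumPy and finite differences; the fields `n(P)` and `m(P)` are arbitrary made-up examples (note that $n\times\partial m/\partial P_i = -(\partial m/\partial P_i\times n)$, hence the minus sign below):

```python
import numpy as np

# Hypothetical smooth vector fields n(P), m(P), for illustration only.
def n(P):
    return np.array([P[0]**2, P[1], P[0]*P[2]])

def m(P):
    return np.array([P[2], P[0]*P[1], P[1]**2])

def jac(f, P, h=1e-6):
    """Central-difference Jacobian with columns d f / d P_j."""
    J = np.zeros((3, 3))
    for j in range(3):
        dP = np.zeros(3); dP[j] = h
        J[:, j] = (f(P + dP) - f(P - dP)) / (2*h)
    return J

def cross_colwise(M, v):
    """Cross each column of M with v (the column-wise convention in the answer)."""
    return np.column_stack([np.cross(M[:, j], v) for j in range(3)])

P = np.array([0.7, -1.2, 0.4])
lhs = jac(lambda P: np.cross(n(P), m(P)), P)
# Column i of rhs is  dn/dP_i x m  +  n x dm/dP_i  =  dn/dP_i x m  -  dm/dP_i x n.
rhs = cross_colwise(jac(n, P), m(P)) - cross_colwise(jac(m, P), n(P))
assert np.allclose(lhs, rhs, atol=1e-5)
```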

ANSWER

$\def\e{\varepsilon}\def\p#1#2{\frac{\partial #1}{\partial #2}}$ The Levi-Civita symbol $(\e_{ijk})$ can be used to write the cross product of two vectors as $$\eqalign{ a\times b &= -a\cdot\e\cdot b \\ &= +b\cdot\e\cdot a \;\;=\; -b\times a }$$ Replacing either vector with a matrix effectively defines a cross product for matrices, e.g. $$\eqalign{ A\times b &= -A\cdot\e\cdot b \\ a\times B &= -a\cdot\e\cdot B \\ A\times B &= -A\cdot\e\cdot B \\ }$$ Even with the matrix cross product so defined, the naive product rule fails: $$\eqalign{ \p{(a\times b)}{p} \;\ne\; a\times\left(\p{b}{p}\right) \;+\; \left(\p{a}{p}\right)\times b \\ }$$ $\big({\rm NB}:$ it is correct when $p$ is a scalar instead of a vector$\big).$
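The $\e$-contraction identities are easy to verify numerically with `einsum`. A sketch (the test vectors and matrix are arbitrary; note that under this answer's definition, $A\times b$ crosses each row of $A$ with $b$):

```python
import numpy as np

# Levi-Civita symbol eps[i, j, k].
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

a = np.array([1.0, 2.0, 3.0])
b = np.array([-1.0, 0.5, 2.0])

# a x b = -a . eps . b : contract a with the first index of eps, b with the last.
cross_via_eps = -np.einsum('i,ijk,k->j', a, eps, b)
assert np.allclose(cross_via_eps, np.cross(a, b))

# Matrix version A x b = -A . eps . b : each row of A is crossed with b.
A = np.arange(9.0).reshape(3, 3)
AxB = -np.einsum('li,ijk,k->lj', A, eps, b)
assert np.allclose(AxB, np.cross(A, b))  # np.cross broadcasts over the rows of A
```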

However, the following rule is compatible with the matrix cross product $$\eqalign{ \p{(a\times b)}{p} \;=\; a\times\left(\p{b}{p}\right) \;-\; b\times\left(\p{a}{p}\right) \\\\ }$$ Another approach is to define a skew matrix associated with each vector $$\eqalign{ {\cal A} &\doteq {\rm Skew}(a) \;=\; (a\times I) \;=\; -a\cdot\e \\ {\cal B} &\doteq {\rm Skew}(b) \;=\; (b\times I) \;=\; -b\cdot\e \\ }$$ Then the cross product can be replaced by an ordinary matrix-vector product $$\eqalign{ a\times b \;=\; {\cal A}b \;=\; -{\cal B}a \\ }$$ and the gradient can be computed with standard matrix-matrix products $$\eqalign{ \p{(a\times b)}{p} &= {\cal A}\left(\p{b}{p}\right) \;-\; {\cal B}\left(\p{a}{p}\right)\\ }$$
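The skew-matrix form of the gradient can also be checked numerically. A sketch using NumPy and finite differences; the fields `a(P)` and `b(P)` are made-up examples:

```python
import numpy as np

def skew(v):
    """Skew(v) such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0,  -v[2],  v[1]],
                     [v[2],  0.0,  -v[0]],
                     [-v[1], v[0],  0.0]])

# Hypothetical fields a(P), b(P), for illustration only.
def a(P): return np.array([np.sin(P[0]), P[1]*P[2], P[0]**2])
def b(P): return np.array([P[2], np.cos(P[1]), P[0]*P[1]])

def jac(f, P, h=1e-6):
    """Central-difference Jacobian with columns d f / d P_j."""
    J = np.zeros((3, 3))
    for j in range(3):
        dP = np.zeros(3); dP[j] = h
        J[:, j] = (f(P + dP) - f(P - dP)) / (2*h)
    return J

P = np.array([0.3, 1.1, -0.8])
# d(a x b)/dP  =  Skew(a) (db/dP)  -  Skew(b) (da/dP)
lhs = jac(lambda P: np.cross(a(P), b(P)), P)
rhs = skew(a(P)) @ jac(b, P) - skew(b(P)) @ jac(a, P)
assert np.allclose(lhs, rhs, atol=1e-5)
```

In practice this form is convenient because everything reduces to ordinary matrix products, which any linear-algebra library handles directly.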