Is it possible to simplify expression with cross product?


So I have an expression $$ J^{-1} \left( w \times (J w) \right) $$ where $J$ is an invertible $3 \times 3$ matrix, $w$ is a 3-element vector, and $\times$ is the cross product.

Is it the same as $$ (J^{-1} w) \times (Jw)? $$

And how can it be simplified?

Edit 1:

It is not, as was pointed out by @PrincessEev.

Edit 2:

It can be added that $J$ actually has physical meaning as an inertia tensor, and as such it is symmetric positive definite. Moreover, there is a coordinate transformation (a rotation) that makes it diagonal. Since $w$ is an angular velocity, we can express it in the coordinate frame that diagonalizes the inertia tensor. So the initial expression can be simplified to:

$$ \operatorname{diag}(v)^{-1} \left( w \times (\operatorname{diag}(v)\, w) \right) $$

Hope this gives some more room for simplification...
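In the diagonal frame the expression does reduce componentwise. Writing $v = (v_1, v_2, v_3)$, a direct expansion of the cross product (the same computation that appears in Euler's rigid body equations) gives

$$ \operatorname{diag}(v)^{-1} \left( w \times (\operatorname{diag}(v)\, w) \right) = \begin{bmatrix} \dfrac{v_3 - v_2}{v_1}\, w_2 w_3 \\[4pt] \dfrac{v_1 - v_3}{v_2}\, w_1 w_3 \\[4pt] \dfrac{v_2 - v_1}{v_3}\, w_1 w_2 \end{bmatrix} $$

so the nonlinearity is just a vector of scaled pairwise products of the components of $w$.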

Edit 3:

To give some more context: the task concerns controlling the angular motion of a rigid body, and the expression defines the angular acceleration of a rigid body under 3D rotation. This yields a nonlinear MIMO system with cross-coupled control channels, which poses challenges for both stability analysis and control synthesis. The only applicable analysis method I'm familiar with for this type of system is the search for a Lyapunov function, which does not tell much about stability margins, sensitivity, or disturbance rejection. I'm trying to figure out whether it is possible to decouple the states or localize the nonlinearities so that structured singular value methods become applicable, or to somehow simplify the linearization so that linear control analysis applies to some extent...
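On the linearization point specifically, the Jacobian of the nonlinear term has a closed form. Using the hat map $\hat{a}\,b = a \times b$ and the product rule (a standard computation, sketched here rather than derived in full),

$$ \delta\left(w \times Jw\right) = \delta w \times Jw + w \times J\,\delta w = -\widehat{(Jw)}\,\delta w + \hat{w}\,J\,\delta w, $$

so the Jacobian of $f(w) = J^{-1}(w \times Jw)$ is

$$ \frac{\partial f}{\partial w} = J^{-1}\left(\hat{w}\,J - \widehat{(Jw)}\right), $$

which may be a useful starting point for linear analysis about a given operating point $w$.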



In principle, no, and this should be immediate from thinking of it geometrically.

$J^{-1} (w \times Jw)$ generates a vector $x$ perpendicular to both $w$ and $Jw$ and then transforms it through $J^{-1}$, whereas $(J^{-1} w) \times (Jw)$ generates a vector $y$ perpendicular to both $J^{-1} w$ and $Jw$. For these to be equal, we need $J^{-1} x = y$, i.e. $x = Jy$: a vector perpendicular to $J^{-1} w$ and $Jw$, when transformed under $J$, would have to be perpendicular to $w$ and $Jw$. That is, $J$ would have to send a vector ($y$) perpendicular to $J^{-1} w, Jw$ to one ($x$) perpendicular to $w, Jw$. This is a bold claim on the face of it.

For example, take

$$J := \begin{bmatrix} 2 & 1 & 1 \\ 1 & 3 & 1 \\ 1 & 1 & 4 \end{bmatrix} \qquad w := \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$$

Nothing special was done for these, other than ensuring $J$ is symmetric positive-definite (its leading principal minors are $2$, $5$, $17$, all positive, so its eigenvalues are strictly positive) and otherwise picking random numbers.

Performing the computation, one sees $$ J^{-1}(w \times Jw) = \begin{bmatrix} -10/17 \\ 46/17 \\ -26/17 \end{bmatrix} \qquad (J^{-1} w) \times (Jw) = \begin{bmatrix} 10/17 \\ 92/17 \\ -66/17 \end{bmatrix} $$
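As a sanity check, the two expressions can be compared numerically. Below is a minimal sketch in Python using exact rational arithmetic from the standard library (the helper names `cross`, `matvec`, `det3`, `solve3` are my own, not from any library):

```python
from fractions import Fraction

def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def matvec(M, v):
    """3x3 matrix times 3-vector."""
    return [sum(M[i][j]*v[j] for j in range(3)) for i in range(3)]

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))

def solve3(M, b):
    """Solve M x = b exactly by Cramer's rule."""
    d = det3(M)
    x = []
    for i in range(3):
        Mi = [row[:] for row in M]
        for r in range(3):
            Mi[r][i] = b[r]          # replace column i with b
        x.append(Fraction(det3(Mi), d))
    return x

J = [[2, 1, 1], [1, 3, 1], [1, 1, 4]]
w = [1, 2, 3]

lhs = solve3(J, cross(w, matvec(J, w)))   # J^{-1}(w x Jw)
rhs = cross(solve3(J, w), matvec(J, w))   # (J^{-1}w) x (Jw)

print(lhs)  # [Fraction(-10, 17), Fraction(46, 17), Fraction(-26, 17)]
print(rhs)  # [Fraction(10, 17), Fraction(92, 17), Fraction(-66, 17)]
print(lhs == rhs)  # False
```

Since the two vectors differ, the proposed identity fails for this $J$ and $w$.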