How to solve $\dot{\mathbf{X}} = \mathbf{W}\times \mathbf{X}$ (3 dimensions)?


Given some nonzero constant vector $\mathbf{W}$, how do I solve:

$$ \dot{\mathbf{X}} = \mathbf{W}\times \mathbf{X} $$

By imagining the motion of $\mathbf{X}$, I expect the solution to involve sines and cosines, but I can't find a good explanation for this; presumably $\mathbf{W}$ affects the coefficients of those sinusoids.

I tried going through the definition of a cross product via indices and I wrote:

$$ \dot{X}^i = \epsilon_{mki}W^m X^k $$

But I don't know how to solve this either.
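As a sanity check on the index expression above, here is a small numerical sketch (my own addition, not part of the original question) confirming that contracting the Levi-Civita symbol as $\epsilon_{mki}W^m X^k$ reproduces the cross product $\mathbf{W}\times\mathbf{X}$:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal(3)
X = rng.standard_normal(3)

# Build the Levi-Civita symbol eps[m, k, i].
eps = np.zeros((3, 3, 3))
for m, k, i in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[m, k, i] = 1.0   # even permutations
    eps[k, m, i] = -1.0  # odd permutations

# Contract eps_{mki} W^m X^k over m and k, leaving index i free.
Xdot = np.einsum('mki,m,k->i', eps, W, X)
assert np.allclose(Xdot, np.cross(W, X))
```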


BEST ANSWER

Your idea using indices will help us solve this equation. If we write it out in matrix form, we know that $$\dot{\mathbf{X}}=\left(\begin{matrix}0 &-W^3&W^2 \\ W^3&0&-W^1\\ -W^2 & W^1&0\end{matrix}\right)\mathbf{X}\equiv \mathbf{M}\mathbf{X}.$$

This can be solved using regular ODE methods, by finding the eigenvalues $m_i$ and eigenvectors $M_i$ of $\mathbf{M}$. One of the eigenvectors is clearly $M_1=\mathbf{W}$, with eigenvalue $0$. The other two eigenvalues are $\pm i\sqrt{(W^1)^2+(W^2)^2+(W^3)^2}$, and you can calculate the eigenvectors $M_2$ and $M_3$ yourself. So if you break up your initial vector into a linear combination of the eigenvectors, you will indeed see sinusoidal time evolution of the components perpendicular to $\mathbf{W}$: an initial vector $$\mathbf{X}(0)=p_1 M_1+p_2 M_2+ p_3 M_3=p_1 \mathbf{W}+p_2 M_2+ p_3 M_3$$evolves into the vector $$\mathbf{X}(t)=p_1 \mathbf{W}+p_2 M_2 e^{i t \sqrt{(W^1)^2+(W^2)^2+(W^3)^2}}+ p_3 M_3 e^{-i t \sqrt{(W^1)^2+(W^2)^2+(W^3)^2}}.$$
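The eigenvalue claims above are easy to check numerically. The following sketch (example values are my own choice, not from the answer) verifies that $\mathbf{M}$ has eigenvalues $0$ and $\pm i|\mathbf{W}|$, and that $\mathbf{W}$ spans the kernel:

```python
import numpy as np

W = np.array([1.0, 2.0, 2.0])          # |W| = 3 for this choice
M = np.array([[0.0,  -W[2],  W[1]],
              [W[2],  0.0,  -W[0]],
              [-W[1], W[0],  0.0]])

evals = np.linalg.eigvals(M)
# All eigenvalues are purely imaginary: 0 and +/- 3i here.
assert np.allclose(evals.real, 0.0)
assert np.allclose(sorted(evals.imag), [-3.0, 0.0, 3.0])
# W is the eigenvector with eigenvalue 0.
assert np.allclose(M @ W, 0.0)
```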

Edit: In summary, the vector $\mathbf{X}$ will precess around the direction given by $\mathbf{W}$ at frequency $|\mathbf{W}|=\sqrt{(W^1)^2+(W^2)^2+(W^3)^2}$.
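The precession picture can also be confirmed directly from the matrix form: since $\mathbf{X}(t)=e^{\mathbf{M}t}\mathbf{X}(0)$, the length of $\mathbf{X}$ and its component along $\mathbf{W}$ stay constant, and the motion is periodic with period $2\pi/|\mathbf{W}|$. A short sketch (my own test values):

```python
import numpy as np
from scipy.linalg import expm

W = np.array([0.0, 0.0, 2.0])          # precession axis, |W| = 2
M = np.array([[0.0,  -W[2],  W[1]],
              [W[2],  0.0,  -W[0]],
              [-W[1], W[0],  0.0]])
X0 = np.array([1.0, 0.0, 1.0])

for t in np.linspace(0.0, 5.0, 11):
    Xt = expm(M * t) @ X0
    assert np.isclose(np.linalg.norm(Xt), np.linalg.norm(X0))  # length preserved
    assert np.isclose(Xt @ W, X0 @ W)                          # W-component fixed

# After one full period T = 2*pi/|W| the vector returns to X(0).
T = 2 * np.pi / np.linalg.norm(W)
assert np.allclose(expm(M * T) @ X0, X0)
```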

ANSWER

Write out the system you have hidden behind that tensor notation.

\begin{align*} \dot{x}_1 &= \phantom{w_0 x_1} -w_3 x_2 + w_2 x_3 \\ \dot{x}_2 &= w_3 x_1 \phantom{{}-w_0 x_2} - w_1 x_3 \\ \dot{x}_3 &= -w_2 x_1 + w_1 x_2 \end{align*}

So $$ \dot{X} = \begin{pmatrix} 0 & -w_3 & w_2 \\ w_3 & 0 & -w_1 \\ -w_2 & w_1 & 0 \end{pmatrix} X \text{.} $$

Then the usual ODE method of finding the eigenvalues and eigenvectors of this matrix yields the solutions.

For instance, where $C$ is a (column) vector of three arbitrary constants of integration (in fact $C = X(0)$), $$ x_1(t) = \frac{1}{|W|^2} \left( w_1(W \cdot C) + \left((w_2^2+w_3^2,\, -w_1 w_2,\, -w_1 w_3) \cdot C\right) \cos(t |W|) + |W|\,(0, -w_3, w_2)\cdot C \,\sin(t |W|) \right) \text{.} $$

The solutions for $x_2$ and $x_3$ are similar.
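As a check, the closed form for $x_1(t)$ can be compared against the matrix-exponential solution $X(t)=e^{Mt}C$; the vectors $w$ and $C$ below are arbitrary test values of my own choosing:

```python
import numpy as np
from scipy.linalg import expm

w = np.array([1.0, -2.0, 0.5])
C = np.array([0.3, 1.0, -0.7])
M = np.array([[0.0,  -w[2],  w[1]],
              [w[2],  0.0,  -w[0]],
              [-w[1], w[0],  0.0]])
nW = np.linalg.norm(w)

def x1(t):
    """Closed-form first component, as given in the answer."""
    a = w[0] * (w @ C)
    b = np.array([w[1]**2 + w[2]**2, -w[0]*w[1], -w[0]*w[2]]) @ C
    c = nW * np.array([0.0, -w[2], w[1]]) @ C
    return (a + b * np.cos(t * nW) + c * np.sin(t * nW)) / nW**2

for t in [0.0, 0.4, 1.7, 3.0]:
    assert np.isclose(x1(t), (expm(M * t) @ C)[0])
```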