Geometric interpretation for eigenvalues and eigenvectors of the cross product's representation as a linear map


Fix ${\bf x} = (x_1,x_2,x_3) \in \Bbb R^3\setminus\{{\bf 0}\}$. We can look at the cross product as a linear map ${\bf x}\times\colon \Bbb R^3 \to \Bbb R^3$, represented in the standard basis by $$\begin{bmatrix} 0 & -x_3 & x_2 \\ x_3 & 0 & -x_1 \\ -x_2 & x_1 & 0\end{bmatrix}.$$Also, it is easy to compute its characteristic polynomial, $p(t) = t(t^2 + \|{\bf x}\|^2)$. Then $0$ is an eigenvalue, and the associated eigenspace is the line spanned by ${\bf x}$ itself. But we can write $$p(t) = t(t-i\|{\bf x}\|)(t+i\|{\bf x}\|),$$and continue the analysis. Assuming I didn't screw up the computations, I get that a complex eigenvector associated to $i\|{\bf x}\|$ is $${\bf v} = \left(-x_1x_3 - x_2\|{\bf x}\|i,\; -x_2x_3+x_1\|{\bf x}\|i,\; x_1^2+x_2^2\right).$$We have $${\rm Re}({\bf v}) = (-x_1x_3,-x_2x_3,x_1^2+x_2^2) \quad\mbox{and}\quad {\rm Im}({\bf v}) = (-x_2\|{\bf x}\|,x_1\|{\bf x}\|,0).$$Then I noticed that: $${\bf x}\times {\rm Re}({\bf v}) = -\|{\bf x}\|\,{\rm Im}({\bf v}) \quad\mbox{and}\quad {\bf x}\times {\rm Im}({\bf v}) = \|{\bf x}\|\,{\rm Re}({\bf v}). $$
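A quick numerical sanity check, sketched in Python with NumPy (the particular vector ${\bf x}=(1,2,3)$ is an arbitrary choice), confirms the eigenvector equation and shows how ${\bf x}\times$ interchanges ${\rm Re}({\bf v})$ and ${\rm Im}({\bf v})$ up to the factor $\pm\|{\bf x}\|$:

```python
import numpy as np

def cross_matrix(x):
    """Standard-basis matrix of the linear map y -> x × y."""
    x1, x2, x3 = x
    return np.array([[0.0, -x3, x2],
                     [x3, 0.0, -x1],
                     [-x2, x1, 0.0]])

x = np.array([1.0, 2.0, 3.0])          # an arbitrary nonzero vector
A = cross_matrix(x)
n = np.linalg.norm(x)
x1, x2, x3 = x

# The claimed eigenvector for the eigenvalue i‖x‖:
v = np.array([-x1*x3 - 1j*x2*n, -x2*x3 + 1j*x1*n, x1**2 + x2**2])
assert np.allclose(A @ v, 1j * n * v)      # A v = i‖x‖ v

re, im = v.real, v.imag
assert np.isclose(re @ im, 0.0)            # Re(v) and Im(v) are orthogonal
assert np.allclose(A @ re, -n * im)        # x × Re(v) = -‖x‖ Im(v)
assert np.allclose(A @ im, n * re)         # x × Im(v) =  ‖x‖ Re(v)
```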

Even more, ${\rm Re}({\bf v})$ and ${\rm Im}({\bf v})$ are orthogonal. I am bewildered by my little discovery. However, I can't quite interpret this geometrically, and I guess that extra factor of $\|{\bf x}\|$ is related to the $i\|{\bf x}\|$ eigenvalue. Can someone explain what's behind these computations?


I just found this question, and I apologize for not searching well enough before asking. Still, there is no satisfactory answer there, and they didn't use the real and imaginary parts of the eigenvectors as I did here; that might make something easier to see, so please don't vote to close as a duplicate (yet?).


BEST ANSWER

I talked to some people outside the internet and one interpretation is as follows:

We have that if $T = {\bf x}\times$, then $T$ annihilates the line spanned by ${\bf x}$ (that line is the kernel), and since $T$ is anti-symmetric, $T$ leaves invariant the orthogonal complement of that line: the plane normal to ${\bf x}$. The restriction of $T$ to that plane ${\bf x}^\perp$ acts as a rotation of $90$ degrees composed with a dilation by $\|{\bf x}\|$, so the eigenvalues are $\pm i\|{\bf x}\|$.

Multiplication by $i$ in the complex plane swaps the real and imaginary parts (up to a sign), so that explains the fact that if ${\bf v}$ is an eigenvector (associated to $i\|{\bf x}\|$, say), then taking real and imaginary parts of $T{\bf v} = i\|{\bf x}\|{\bf v}$ gives $$T({\rm Re}({\bf v})) = -\|{\bf x}\|{\rm Im}({\bf v})\quad\mbox{ and }\quad T({\rm Im}({\bf v})) = \|{\bf x}\|{\rm Re}({\bf v}).$$
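The rotation-plus-dilation picture can be made concrete: pick an orthonormal basis $({\bf e}_1,{\bf e}_2)$ of ${\bf x}^\perp$ with ${\bf e}_2 = ({\bf x}/\|{\bf x}\|)\times{\bf e}_1$, and compute the matrix of the restriction of $T$ in that basis. A sketch in Python with NumPy (the vector ${\bf x}$ and the choice of ${\bf e}_1$ are arbitrary):

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])              # arbitrary nonzero vector, ‖x‖ = 3
n = np.linalg.norm(x)
A = np.array([[0.0, -x[2], x[1]],
              [x[2], 0.0, -x[0]],
              [-x[1], x[0], 0.0]])          # the matrix of T = x ×

# Orthonormal basis (e1, e2) of the plane x^⊥, chosen so that
# e2 = (x/‖x‖) × e1, i.e. (e1, e2, x/‖x‖) is positively oriented.
e1 = np.array([x[1], -x[0], 0.0])
e1 /= np.linalg.norm(e1)
e2 = np.cross(x / n, e1)

# Matrix of the restriction of T to x^⊥ in the basis (e1, e2):
B = np.array([[e1 @ A @ e1, e1 @ A @ e2],
              [e2 @ A @ e1, e2 @ A @ e2]])
assert np.allclose(B, n * np.array([[0.0, -1.0], [1.0, 0.0]]))
# B is ‖x‖ times a rotation by 90°, whose eigenvalues are ±i‖x‖
```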

This sort of generalizes to $\Bbb R^n$: fix $n-2$ linearly independent vectors ${\bf x}_1,\cdots,{\bf x}_{n-2}$ and look at the linear map $$T = {\bf x}_1 \times \cdots \times {\bf x}_{n-2}\times\colon \Bbb R^n \to \Bbb R^n$$built from the $(n-1)$-fold generalized cross product. We have $\ker T = {\rm span}\{{\bf x}_1,\cdots,{\bf x}_{n-2}\}$, and $T$ also leaves invariant the orthogonal complement of that $(n-2)$-plane, which is a $2$-plane. On that $2$-plane, $T$ acts as a rotation composed with a dilation by some factor determined by the fixed vectors.
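For instance, in $\Bbb R^4$ one can define the generalized cross product of three vectors by $\langle {\bf a}\times{\bf b}\times{\bf c}, {\bf y}\rangle = \det[{\bf a};{\bf b};{\bf c};{\bf y}]$. A numerical sketch in Python with NumPy (assuming that determinant definition; `cross4` is an ad hoc helper and the random vectors are arbitrary) checks the kernel and the rotation-plus-dilation behavior on the orthogonal $2$-plane:

```python
import numpy as np

def cross4(a, b, c):
    """Generalized cross product in R^4, defined by
    <cross4(a,b,c), y> = det[a; b; c; y] for every y."""
    M = np.stack([a, b, c])
    return np.array([(-1) ** (j + 1) * np.linalg.det(np.delete(M, j, axis=1))
                     for j in range(4)])

rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal(4), rng.standard_normal(4)
T = lambda y: cross4(x1, x2, y)

# The fixed vectors span the kernel (repeated rows kill the determinant):
assert np.allclose(T(x1), 0) and np.allclose(T(x2), 0)

# Orthonormal basis (e1, e2) of span{x1, x2}^⊥ via a full QR factorization:
Q, _ = np.linalg.qr(np.stack([x1, x2]).T, mode='complete')
e1, e2 = Q[:, 2], Q[:, 3]

# On that 2-plane T is skew-symmetric: a 90° rotation times a dilation.
B = np.array([[e1 @ T(e1), e1 @ T(e2)],
              [e2 @ T(e1), e2 @ T(e2)]])
assert np.isclose(B[0, 0], 0) and np.isclose(B[1, 1], 0)
assert np.isclose(B[0, 1], -B[1, 0])
```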

ANOTHER ANSWER

The essentials are getting a bit lost in the coordinate forms you're using. There's no nice way to write this in coordinates because there are no canonical elements of the eigenspaces. It gets a lot clearer if you abstract from the coordinates.

Take any vector $\mathbf y$ orthogonal to $\mathbf x$. Then

$$ \mathbf y+\mathrm i\frac {\mathbf x}{\|\mathbf x\|}\times\mathbf y $$

is an eigenvector with eigenvalue $-\mathrm i\|\mathbf x\|$: when you apply the cross product, the first term yields $-\mathrm i\|\mathbf x\|$ times the second and vice versa, since $\mathbf x\times(\mathbf x\times\mathbf y)=-\|\mathbf x\|^2\mathbf y$ for $\mathbf y$ orthogonal to $\mathbf x$.
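This is easy to check numerically; a sketch in Python with NumPy (the particular $\mathbf x$ and $\mathbf y$ are arbitrary choices):

```python
import numpy as np

x = np.array([2.0, -1.0, 2.0])                 # arbitrary nonzero vector
n = np.linalg.norm(x)

y = np.array([0.3, 1.4, -0.5])
y = y - (y @ x) / n**2 * x                     # project y onto x^⊥, so y ⟂ x
assert np.isclose(x @ y, 0.0)

u = y + 1j * np.cross(x / n, y)                # the claimed eigenvector
assert np.allclose(np.cross(x, u), -1j * n * u)    # eigenvalue -i‖x‖
```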

If you want to see it in coordinates, choose the coordinate system such that $\mathbf x=(\|\mathbf x\|,0,0)$. Then the matrix is

$$ \|\mathbf x\|\pmatrix{0&0&0\\0&0&-1\\0&1&0} $$

and the eigenvectors are

$$ \pmatrix{1\\0\\0}\;,\;\pmatrix{0\\1\\\mathrm i}\;,\;\pmatrix{0\\1\\-\mathrm i}\;, $$

with eigenvalues $0$, $-\mathrm i\|\mathbf x\|$ and $\mathrm i\|\mathbf x\|$, respectively.

As was discussed in the comments, this is related to the fact that the skew-symmetric matrices are the generators of rotations. Denoting your matrix by $A_{\mathbf x}$ and assuming $\|\mathbf x\|=1$, we have

$$ \mathrm e^{\phi A_{\mathbf x}}\mathbf y=\mathbf y_\parallel+\cos\phi\,\mathbf y_\perp+\sin\phi\,\mathbf x\times\mathbf y_\perp\;, $$

where $\mathbf y_\parallel=\mathbf x\mathbf x^\top\mathbf y$ and $\mathbf y_\perp=\mathbf y-\mathbf y_\parallel$. Again we can make this explicit in coordinates in the case $\mathbf x=(1,0,0)$, where

$$ \exp\left(\phi\pmatrix{0&0&0\\0&0&-1\\0&1&0}\right)\pmatrix{x\\y\\z}=\pmatrix{1&0&0\\0&\cos\phi&-\sin\phi\\0&\sin\phi&\cos\phi}\pmatrix{x\\y\\z}\;. $$
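The rotation formula can also be verified numerically. Below is a sketch in Python with NumPy, using a naive truncated Taylor series for the matrix exponential (adequate for these small matrices, and avoiding any dependency beyond NumPy); the unit vector $\mathbf x$, the angle $\phi$ and the test vector $\mathbf y$ are arbitrary choices:

```python
import numpy as np

def expm(M, terms=30):
    """Matrix exponential by truncated Taylor series (fine for small ‖M‖)."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

x = np.array([2.0, 1.0, 2.0])
x /= np.linalg.norm(x)                      # the formula assumes ‖x‖ = 1
A = np.array([[0.0, -x[2], x[1]],
              [x[2], 0.0, -x[0]],
              [-x[1], x[0], 0.0]])          # A_x

phi = 0.7                                   # arbitrary angle
y = np.array([0.2, -1.0, 0.5])              # arbitrary test vector
y_par = x * (x @ y)                         # y_∥ = x xᵀ y
y_perp = y - y_par

lhs = expm(phi * A) @ y
rhs = y_par + np.cos(phi) * y_perp + np.sin(phi) * np.cross(x, y_perp)
assert np.allclose(lhs, rhs)
```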

You can think of

$$ \exp\left(\phi\pmatrix{0&-1\\1&0}\right)=\cos\phi\pmatrix{1&0\\0&1}+\sin\phi\pmatrix{0&-1\\1&0} $$

as an analogue of $\mathrm e^{\mathrm i\phi}=\cos\phi+\mathrm i\sin\phi$, with

$$ \pmatrix{0&-1\\1&0}^2=-\pmatrix{1&0\\0&1} $$

playing the role of $\mathrm i^2=-1$.