Geometric interpretation for complex eigenvectors of a 2×2 rotation matrix


The rotation matrix $$\pmatrix{ \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta}$$ has complex eigenvalues $\{e^{\pm i\theta}\}$ corresponding to eigenvectors $\pmatrix{1 \\i}$ and $\pmatrix{1 \\ -i}$. The real eigenvector of a 3D rotation matrix has a natural interpretation as the axis of rotation. Is there a nice geometric interpretation of the eigenvectors of the $2 \times 2$ matrix?


There are 4 answers below.

BEST ANSWER

Lovely question!

There is a kind of intuitive way to view the eigenvalues and eigenvectors, and it ties in with geometric ideas as well (without resorting to four dimensions!).

The matrix is unitary (more specifically, it is real, so it is called orthogonal), and so there is an orthogonal basis of eigenvectors. Here, as you noted, the eigenvectors are $\pmatrix{1 \\i}$ and $\pmatrix{1 \\ -i}$; let us call them $v_1$ and $v_2$. They form a basis of $\mathbb{C}^2$, and since $\mathbb{R}^2$ is a subset of $\mathbb{C}^2$, we can write any element of $\mathbb{R}^2$ in terms of $v_1$ and $v_2$ as well. (And we normally think of rotations as occurring in $\mathbb{R}^2$! Note that $\mathbb{C}^2$ is a two-dimensional vector space with components in $\mathbb{C}$; it need not be considered as four-dimensional with components in $\mathbb{R}$.)

We can then represent any vector in $\mathbb{R}^2$ uniquely as a linear combination of these two vectors, $x = \lambda_1 v_1 + \lambda_2 v_2$, with $\lambda_i \in \mathbb{C}$. So if we call the linear map that the matrix represents $R$, then

$$R(x) = R(\lambda_1 v_1 + \lambda_2 v_2) = \lambda_1 R(v_1) + \lambda_2 R(v_2) = e^{i\theta}\lambda_1 v_1 + e^{-i\theta}\lambda_2 v_2.$$

In other words, when working in the basis $\{v_1, v_2\}$: $$R \pmatrix{\lambda_1 \\\lambda_2} = \pmatrix{e^{i\theta}\lambda_1 \\ e^{-i\theta}\lambda_2}$$

And we know that multiplying a complex number by $e^{i\theta}$ is an anticlockwise rotation by $\theta$. So rotating a vector, when it is represented in the basis $\{v_1, v_2\}$, is the same as just rotating the individual components of the vector in the complex plane!
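This is easy to verify numerically. Here is a minimal sketch in Python/NumPy (the value of $\theta$ and the test vector are arbitrary assumptions chosen for illustration):

```python
import numpy as np

theta = 0.7  # arbitrary angle, assumed for illustration
R = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

v1 = np.array([1, 1j])   # eigenvector for e^{+i*theta}
v2 = np.array([1, -1j])  # eigenvector for e^{-i*theta}

# Express an arbitrary real vector x in the basis {v1, v2}:
# solve [v1 v2] @ lam = x for the complex coordinates lam.
x = np.array([2.0, 3.0])
P = np.column_stack([v1, v2])
lam = np.linalg.solve(P, x)

# Applying R to x is the same as rotating each coordinate
# of x (in the eigenbasis) within the complex plane.
lhs = R @ x
rhs = P @ (np.array([np.exp(1j*theta), np.exp(-1j*theta)]) * lam)
print(np.allclose(lhs, rhs))  # True
```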

ANSWER

The simplest answer to your question is perhaps yes. The eigenvectors corresponding to a genuinely complex eigenvalue are necessarily complex, so there is no real vector that is an eigenvector of the matrix. Ignoring, of course, the nice cases $\theta = 0, \pi$, the rotation always does more than just rescale a vector.

On the other hand, if we view the matrix as a linear map on $\mathbb{C}^2$, then the eigenvectors you give show the directions in which the matrix acts as a rescaling operator in the complex space $\mathbb{C}^2$. I hope someone has a better answer; I would like to be able to visualize complex two-space.

ANSWER

One can also directly show that a real matrix $A$ with complex eigenvalues describes a rotation (possibly followed by a rescaling) in the basis of the real and imaginary parts of the corresponding eigenvectors. More precisely:

  1. Observe that if $A$ is real, and has a complex (non-real) eigenvalue $z\in\mathbb C\setminus\mathbb R$, with eigenvector $\mathbf v$, then $\mathbf v$ must also be complex (and not real). Furthermore, $\bar z$ must also be an eigenvalue, with eigenvector $\bar{\mathbf{v}}$ (complex conjugate of $\mathbf v$).

  2. Observe that $A\mathbf v=z\mathbf v$, decomposed in its real and imaginary components, corresponds to $$\begin{cases} A \mathbf v_R &= z_R \mathbf v_R - z_I \mathbf v_I, \\ A \mathbf v_I &= z_I \mathbf v_R + z_R \mathbf v_I. \end{cases}$$ Observe the similarity between this and the expression for a $2\times 2$ rotation. Furthermore, write the polar decomposition of the eigenvalue as $z=r e^{i\theta}$. Then the above becomes $$\begin{cases} A \mathbf v_R &= r[\cos(\theta) \mathbf v_R - \sin(\theta) \mathbf v_I], \\ A \mathbf v_I &= r[\sin(\theta) \mathbf v_R + \cos(\theta) \mathbf v_I]. \end{cases}$$ This is clearly a rotation matrix (up to the scaling factor). Or rather, it would be, if $\mathbf v_R$ and $\mathbf v_I$ were orthogonal (as real vectors). When this is not the case, the matrix will act, in the original coordinates, as a rotation followed by a rescaling.

In conclusion, the real and imaginary parts of the eigenvector corresponding to a complex eigenvalue span a two-dimensional plane which $A$ leaves invariant, and $A$ acts as a two-dimensional rotation matrix with respect to some basis chosen in this plane. More explicitly, we are saying that $$A = r P R(\theta) P^{-1},$$ where $R(\theta)\in\mathbf{SO}(2)$ represents the rotation by an angle $\theta$, and $P$ is the matrix whose columns are $\mathbf v_R$ and $\mathbf v_I$.
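These claims are easy to check numerically. Below is a minimal sketch in Python/NumPy; the particular matrix $A$ is an arbitrary assumption for illustration (any real matrix with complex eigenvalues works), and $R(\theta)$ is taken in the same form as the matrix in the question:

```python
import numpy as np

# An arbitrary real 2x2 matrix with complex eigenvalues
# (an illustrative assumption, not from the text above).
A = np.array([[1.0, -2.0],
              [1.5,  0.5]])

z, vecs = np.linalg.eig(A)
k = np.argmax(z.imag)          # pick the eigenvalue with positive imaginary part
z, v = z[k], vecs[:, k]
vR, vI = v.real, v.imag
r, theta = abs(z), np.angle(z)

# The real/imaginary decomposition of A v = z v (step 2 above):
print(np.allclose(A @ vR, z.real * vR - z.imag * vI))  # True
print(np.allclose(A @ vI, z.imag * vR + z.real * vI))  # True

# The factorization A = r P R(theta) P^{-1}, with P = [vR | vI]:
P = np.column_stack([vR, vI])
Rtheta = np.array([[np.cos(theta),  np.sin(theta)],
                   [-np.sin(theta), np.cos(theta)]])
print(np.allclose(A, r * P @ Rtheta @ np.linalg.inv(P)))  # True
```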

ANSWER

Any 2D rotation matrix $R$ is defined by how much it rotates any vector in the plane counter-clockwise, namely by $\theta$. Here is the geometry of the eigenvectors of $R$, followed by an extension to 3D. No proofs here, and no appeal to theorems about diagonalization and so on.

We can construct an eigenvector $v$ of $R$ from any real vector $v_r$ in the plane. How? Define a second vector $v_i$ that is $v_r$ rotated 90 degrees counter-clockwise. Then the vector $v$,

$$v = v_r + i v_i$$

is an eigenvector of $R$ with associated eigenvalue $\lambda$,

$$\lambda = \cos(\theta) - i\sin(\theta).$$

As you can see from the construction, $R$ has lots of eigenvectors.

Another eigenvector, with a different eigenvalue, can be obtained the same way, again using any vector $v_r$ in the plane (not necessarily the same as the one above), this time taking $v_i$ to be a 90 degree clockwise rotation of $v_r$. In this case, again

$$v = v_r + i v_i$$

and the associated eigenvalue is now

$$\lambda = \cos(\theta) + i\sin(\theta).$$
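A quick numerical check of both constructions (Python/NumPy; the angle and the random choice of $v_r$ are arbitrary assumptions, and $R$ here is the counter-clockwise rotation matrix used in this answer):

```python
import numpy as np

theta = 0.7  # arbitrary angle, assumed for illustration
# Counter-clockwise rotation by theta (this answer's convention).
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

rng = np.random.default_rng(0)
v_r = rng.standard_normal(2)            # any real vector in the plane

# J rotates a vector 90 degrees counter-clockwise.
J = np.array([[0.0, -1.0], [1.0, 0.0]])
v_i = J @ v_r
v = v_r + 1j * v_i
print(np.allclose(R @ v, (np.cos(theta) - 1j*np.sin(theta)) * v))  # True

# Using the 90-degree *clockwise* rotation instead gives the other eigenvalue.
w = v_r + 1j * (-J @ v_r)
print(np.allclose(R @ w, (np.cos(theta) + 1j*np.sin(theta)) * w))  # True
```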

All eigenvectors of $R$ have one of the two forms above. If someone asks you for the eigenvectors of a rotation matrix $R$, you can get a canonical pair, $v^1$ and $v^2$, by picking the first $v_r$ to be a unit vector parallel to one of the coordinate axes and its associated $v_i$ parallel (or antiparallel) to the other axis. Then recycle $v_r$ and $v_i$ for the other eigenvector. Thus you get a nicely matched pair:

$$v^1 = v_r + i v_i$$

with eigenvalue $\cos(\theta) - i\sin(\theta)$, and

$$v^2 = v_r - i v_i$$

with eigenvalue $\cos(\theta) + i\sin(\theta)$.

This also works in 3D. Pick all of the real vectors in the discussion above to be in the plane orthogonal to the axis of rotation and only consider rotations about the axis of rotation, CW and CCW. That is, pick any real vector $v_r$ in the plane orthogonal to the axis of rotation. Also consider the 90 degree rotation $v_i$ of that vector about the axis, using the right-hand rule. Then you get an eigenvector $v$ of $R$:

$$v = v_r + i v_i.$$

As for the 2D case, in 3D you can get another eigenvector using any other vector in the plane orthogonal to the axis, and its left-hand rotation about the axis by 90 degrees. To get a unique answer, MATLAB starts with $v_i$ and makes it a real unit vector in the plane orthogonal to the axis of rotation that also has zero $y$ component; then it picks $v_r$ as a 90 degree rotation of $v_i$ about the rotation axis. Then it picks the second eigenvector to be the complex conjugate.

All eigenvectors of a rotation matrix in 2D or 3D (not counting the axis eigenvector) have real and imaginary parts that are orthogonal to each other and to the axis of rotation. And for each eigenvector, the real and imaginary parts have the same magnitude. Conversely, any vector whose real and imaginary parts meet these conditions is an eigenvector.
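The 3D construction can be checked the same way. The sketch below (Python/NumPy) builds the rotation matrix from the axis via Rodrigues' formula; the specific axis and angle are illustrative assumptions, not part of the answer above:

```python
import numpy as np

theta = 0.7                                  # arbitrary angle (assumption)
n = np.array([1.0, 2.0, 2.0]) / 3.0          # arbitrary unit axis (assumption)

# Cross-product matrix: K @ u == np.cross(n, u).
K = np.array([[0.0, -n[2], n[1]],
              [n[2], 0.0, -n[0]],
              [-n[1], n[0], 0.0]])
# Rodrigues' formula for rotation by theta about n.
R = np.eye(3) + np.sin(theta)*K + (1 - np.cos(theta))*(K @ K)

# Any real v_r orthogonal to the axis, and v_i its 90-degree rotation
# about the axis (right-hand rule): v_i = n x v_r.
v_r = np.cross(n, np.array([0.0, 0.0, 1.0]))
v_r /= np.linalg.norm(v_r)
v_i = np.cross(n, v_r)

v = v_r + 1j * v_i
print(np.allclose(R @ v, (np.cos(theta) - 1j*np.sin(theta)) * v))  # True
print(np.allclose(R @ n, n))  # True: the axis is the real eigenvector
```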