Repeated applications of a (rotation) matrix keep you in the same subspace?


Say I have a unit vector $v$ in $\Bbb{R}^n$ ($|v|=1$). I multiply this vector by a rotation matrix $R$ to get a new vector, $u=Rv$. Then I get a third vector by applying the rotation matrix once more: $w=R^2v$. Is it true that $w$ lies in the space spanned by $v$ and $u$? Can this be proven, or a counterexample shown?

And if this is true for rotation matrices, then is it true for other kinds of matrices as well?


My attempt: If the claim is true, we must have $w=c_1v+c_2 u$ for some scalars $c_1$ and $c_2$. In other words,

$$R^2v = c_1v+c_2 Rv$$

I'm stuck at this stage, unfortunately.

4 Answers

Answer 1:

Consider

$R=\begin{pmatrix}1 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0\end{pmatrix}$

$u=\begin{pmatrix}1 \\ 1 \\ 0\end{pmatrix}$

$v = Ru = \begin{pmatrix}1 \\ 0 \\ 1\end{pmatrix}$

$w = R^2u = \begin{pmatrix} 1 \\ -1 \\ 0 \end{pmatrix}$

Since $u$, $v$, and $w$ are linearly independent, $\mathrm{span}(u,v) \neq \mathrm{span}(v,w)$ and, in particular, $w \notin \mathrm{span}(u, v)$.
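A quick numerical sketch (my own, not part of this answer) confirming the independence claim, using plain Python lists as vectors:

```python
# Check the counterexample: R rotates by 90 degrees about the x-axis.

def matvec(M, x):
    """Multiply a 3x3 matrix (list of rows) by a length-3 vector."""
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

def det3(a, b, c):
    """Determinant of the 3x3 matrix with rows a, b, c."""
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

R = [[1, 0, 0],
     [0, 0, -1],
     [0, 1, 0]]
u = [1, 1, 0]
v = matvec(R, u)   # [1, 0, 1]
w = matvec(R, v)   # [1, -1, 0]

# Nonzero determinant means u, v, w are linearly independent,
# so w cannot lie in span(u, v).
print(v, w, det3(u, v, w))  # [1, 0, 1] [1, -1, 0] 2
```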


Answer 2:

Counter-example:

\begin{gather} R = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{bmatrix}, \quad |R|=1, \quad RR^T = R^TR = I, \quad R \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix} = \begin{bmatrix} v_2 \\ v_3 \\ v_1\end{bmatrix} \\ \\ v = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \rightarrow u = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}, \; w = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \end{gather}
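A quick check of this cyclic-shift counterexample (my own sketch, not part of the original answer): applying $R$ twice to $e_1$ gives $e_2$, which cannot be a combination of $e_1$ and $e_3$ because any such combination has second component $0$.

```python
# R maps (v1, v2, v3) to (v2, v3, v1).
R = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

v = [1, 0, 0]
u = matvec(R, v)   # [0, 0, 1]
w = matvec(R, u)   # [0, 1, 0]
print(u, w)
# w has second component 1, but c1*v + c2*u always has second component 0,
# so w is not in span(v, u).
```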


Now suppose we want to find which matrices satisfy the desired subspace property. For a diagonalizable matrix $M$, you can decompose a vector as a linear combination of the matrix's eigenvectors, $$ v = \sum_i a_i p_i, \quad M p_i = \lambda_i p_i$$

We then get $u$ and $w$ as $$ u = M v = \sum_i a_i \lambda_i p_i, \quad w = M u = \sum_i a_i \lambda_i^2 p_i.$$

Then imposing the subspace condition, \begin{gather} \sum_i a_i \lambda_i^2 p_i = c_1 \sum_i a_i p_i + c_2 \sum_i a_i \lambda_i p_i \\ \sum_i a_i \left(c_1 + c_2 \lambda_i - \lambda_i^2 \right) p_i = 0 \end{gather}

If you want the subspace property to be true independent of the starting vector $v$, then the above condition should be satisfied independent of $a_i$; i.e. there need to exist $c_1$ and $c_2$ such that $c_1 + c_2 \lambda_i - \lambda_i^2 = 0$ for all $i$. But for any specific $c_1$ and $c_2$, this is a quadratic equation that at most two distinct values of $\lambda$ can satisfy. Thus the matrix $M$ must have at most two distinct eigenvalues.
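For instance (my own sketch, not from the answer): a $4\times4$ rotation built from two planar blocks with the *same* angle has only the two distinct eigenvalues $e^{\pm i\theta}$, and by Cayley-Hamilton on each block it satisfies $R^2 = 2\cos\theta\, R - I$, so $R^2 v$ lies in $\mathrm{span}(v, Rv)$ for every $v$:

```python
import math

# 4x4 rotation made of two planar blocks with the SAME angle theta:
# only two distinct eigenvalues e^{+-i*theta}, so R^2 = 2*cos(theta)*R - I.
theta = math.pi / 3
c, s = math.cos(theta), math.sin(theta)
R = [[c, -s, 0, 0],
     [s,  c, 0, 0],
     [0,  0, c, -s],
     [0,  0, s,  c]]

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

v = [1.0, 2.0, -1.0, 0.5]   # arbitrary starting vector
u = matvec(R, v)
w = matvec(R, u)

# w should equal 2*cos(theta)*u - v componentwise, i.e. w is in span(v, u).
resid = max(abs(w[i] - (2 * c * u[i] - v[i])) for i in range(4))
print(resid)  # ~0 (floating-point roundoff)
```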

I hypothesize that, more generally, the number of applications of the matrix needed before you return to the span equals the number of distinct eigenvalues: if $M$ has $k$ distinct eigenvalues, then $M^k v$ lies in the span of $v, Mv, \dots, M^{k-1}v$.
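The cyclic permutation above illustrates this hypothesis (my own check, not from the answer): it has three distinct eigenvalues (the cube roots of unity), and two applications are not enough, but the third lands back in the span, since in fact $R^3 = I$.

```python
# The cyclic permutation has eigenvalues 1, w, w^2 (cube roots of unity),
# and R^3 is exactly the identity, so R^3 v = v is trivially in the span.
R = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

R3 = matmul(R, matmul(R, R))
print(R3)  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```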

Answer 3:

There may be easier counterexamples, but this works:

Let $A$ be the rotation matrix of angle $\theta = \frac{\pi}{4}$ about the $z$-axis, i.e. the $z$-axis is the axis of rotation. Then

$$A = \left[ \begin{array}{ccc} \cos \theta & -\sin \theta & 0 \\ \sin \theta & \cos \theta & 0 \\ 0 & 0 & 1 \end{array} \right] = \left[ \begin{array}{ccc} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0 \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} & 0 \\ 0 & 0 & 1 \end{array} \right].$$ Now take the unit vector $v = (0, \frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}). $ We have $$Av = \left[ \begin{array}{ccc} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0 \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} & 0 \\ 0 & 0 & 1 \end{array} \right] \left[ \begin{array}{c} 0 \\ \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{array} \right] = \left[ \begin{array}{c} -\frac{1}{2} \\ \frac{1}{2} \\ \frac{1}{\sqrt{2}} \end{array} \right],$$ and then $$A^2v = \left[ \begin{array}{ccc} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0 \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} & 0 \\ 0 & 0 & 1 \end{array} \right] \left[ \begin{array}{c} -\frac{1}{2} \\ \frac{1}{2} \\ \frac{1}{\sqrt{2}} \end{array} \right] = \left[ \begin{array}{c} -\frac{1}{\sqrt{2}} \\ 0 \\ \frac{1}{\sqrt{2}} \end{array} \right].$$

Lastly, it is straightforward to check that $(-\frac{1}{\sqrt{2}}, 0, \frac{1}{\sqrt{2}})$ is not in the span of $(0, \frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}})$ and $(- \frac{1}{2}, \frac{1}{2}, \frac{1}{\sqrt{2}}).$
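That check can be made concrete (my own sketch, not part of the answer) by computing the determinant of the matrix whose rows are $v$, $Av$, and $A^2v$:

```python
import math

s2 = 1 / math.sqrt(2)
v = [0.0, s2, s2]     # starting unit vector
u = [-0.5, 0.5, s2]   # Av
w = [-s2, 0.0, s2]    # A^2 v

# Determinant of the 3x3 matrix with rows v, u, w.
det = (v[0] * (u[1] * w[2] - u[2] * w[1])
       - v[1] * (u[0] * w[2] - u[2] * w[0])
       + v[2] * (u[0] * w[1] - u[1] * w[0]))
print(det)  # (1/2)(1 - 1/sqrt(2)) ~ 0.1464: nonzero, so w is outside span(v, u)
```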

Answer 4:

In 3 dimensions, you can argue geometrically. If the axis of the rotation is in the direction of $\mathbf{k}$, and the angle between $\mathbf{v}$ and $\mathbf{k}$ is $\phi$, then the angle between $R^n\mathbf{v}$ and $\mathbf{k}$ will also be $\phi$, for any $n$. So the rotated vectors lie on a cone with $\mathbf{k}$ as axis; only in degenerate cases ($\phi = 0$ or $\pi$, so $\mathbf{v}$ lies on the axis, or a rotation angle of $0$ or $\pi$), or when $\mathbf{v}$ and $\mathbf{k}$ are perpendicular, will $R^2\mathbf{v}$ lie in the same plane as $\mathbf{v}$ and $R\mathbf{v}$.
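A small numerical illustration of the cone picture (my own sketch, reusing the $\pi/4$ rotation from the previous answer): the dot product of $R^n\mathbf{v}$ with $\mathbf{k}$, here just the $z$-component, never changes, so every iterate stays on the same cone.

```python
import math

# Rotation about the z-axis k = (0, 0, 1) by pi/4.
theta = math.pi / 4
c, s = math.cos(theta), math.sin(theta)
R = [[c, -s, 0],
     [s,  c, 0],
     [0,  0, 1]]

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

v = [0.3, 0.4, 0.5]
x = v
dots = []
for n in range(6):
    dots.append(x[2])      # dot(R^n v, k) is just the z-component
    x = matvec(R, x)

print(dots)  # every entry is 0.5
```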

As for other matrices, you are essentially asking whether the matrix $R^2-c_1R-c_2I$ has an eigenvector with eigenvalue $0$, i.e. has zero determinant. Applying this condition gives a relation between $c_1$ and $c_2$ (in three dimensions, a cubic one), hence a set of possible values of $c_1$ and $c_2$, and so some vectors $v$ for which this is true. If you want it to be true for every vector, then you need the matrix $R$ to satisfy the quadratic equation $R^2-c_1R-c_2I=0$, for some $c_1, c_2$.

In the case of $\mathbb{R}^n$, you can use the fact that orthogonal matrices can always be brought, by an orthonormal change of basis, into block diagonal form, with each block being a $2 \times 2$ rotation matrix, unless $n$ is odd, in which case one of the blocks is just $1$. Then the condition $\det(R^2-c_1R-c_2I) = 0 $ must hold for at least one of the blocks. With the $2 \times 2$ rotation having angle $\theta$ this reduces to $$(c_1- (1-c_2)\cos \theta)^2+\sin^2 \theta (c_2+1)^2=0.$$ One possibility is $\theta = 0$ or $\pi$, giving $c_1=\pm(1-c_2)$, but this is the rather trivial case. The other possibility is that $c_2=-1$ in which case, $c_1=2\cos \theta$. This case corresponds to the fact that any vector in the two-dimensional eigenspace corresponding to the block you have chosen will have all its rotations in the same hyperplane.
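The displayed identity can be spot-checked numerically (my own sketch, not part of the answer): for a $2 \times 2$ rotation block, $\det(R(\theta)^2 - c_1 R(\theta) - c_2 I)$ agrees with $(c_1 - (1-c_2)\cos\theta)^2 + \sin^2\theta\,(c_2+1)^2$, and the choice $c_2=-1$, $c_1=2\cos\theta$ does make it vanish.

```python
import math

def det_expr(theta, c1, c2):
    """det(R(theta)^2 - c1*R(theta) - c2*I) for a 2x2 rotation block."""
    c, s = math.cos(theta), math.sin(theta)
    R = [[c, -s], [s, c]]
    R2 = [[R[i][0] * R[0][j] + R[i][1] * R[1][j] for j in range(2)]
          for i in range(2)]
    M = [[R2[i][j] - c1 * R[i][j] - (c2 if i == j else 0) for j in range(2)]
         for i in range(2)]
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

theta, c1, c2 = 0.7, 1.3, -0.4
lhs = det_expr(theta, c1, c2)
rhs = (c1 - (1 - c2) * math.cos(theta)) ** 2 \
      + math.sin(theta) ** 2 * (c2 + 1) ** 2
print(abs(lhs - rhs))                              # ~0: identity holds
print(det_expr(0.7, 2 * math.cos(0.7), -1.0))      # ~0: the c2 = -1 solution
```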