A certain unique rotation matrix


One can find that the matrix

$A=\begin{bmatrix} -\dfrac{1}{3} & \dfrac{2}{3} & \dfrac{2}{3} \\ \dfrac{2}{3} & -\dfrac{1}{3} & \dfrac{2}{3} \\ \dfrac{2}{3} & \dfrac{2}{3} & -\dfrac{1}{3} \\ \end{bmatrix} $

is a $3D$ rotation matrix for which the sum of the entries in every column (and every row) is constant (here $-\dfrac{1}{3}+ \dfrac{2}{3} + \dfrac{2}{3} = 1$).
The same is true if we cyclically permute its columns.

For example:

$A_1=\begin{bmatrix} \dfrac{2}{3} & \dfrac{2}{3} & -\dfrac{1}{3} \\ -\dfrac{1}{3} & \dfrac{2}{3} & \dfrac{2}{3} \\ \dfrac{2}{3} & -\dfrac{1}{3} & \dfrac{2}{3} \\ \end{bmatrix} $ $A_2=\begin{bmatrix} \dfrac{2}{3} & -\dfrac{1}{3} & \dfrac{2}{3} \\ \dfrac{2}{3} & \dfrac{2}{3} & -\dfrac{1}{3} \\ -\dfrac{1}{3} & \dfrac{2}{3} & \dfrac{2}{3} \\ \end{bmatrix} $
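The claims above can be checked numerically. Here is a minimal sketch (assuming NumPy) that verifies $A$ and its cyclic column permutations are proper rotations with unit row and column sums:

```python
import numpy as np

# Verify that A and its cyclic column permutations are rotation matrices
# (orthogonal, determinant +1) whose row and column sums all equal 1.
A = np.array([[-1, 2, 2],
              [ 2, -1, 2],
              [ 2, 2, -1]]) / 3.0

for P in ([0, 1, 2], [2, 0, 1], [1, 2, 0]):   # cyclic column permutations
    M = A[:, P]
    assert np.allclose(M.T @ M, np.eye(3))     # orthogonal
    assert np.isclose(np.linalg.det(M), 1.0)   # proper rotation
    assert np.allclose(M.sum(axis=0), 1.0)     # constant column sums
    assert np.allclose(M.sum(axis=1), 1.0)     # constant row sums
print("all checks passed")
```

Note that only even (cyclic) permutations preserve the determinant $+1$; an odd column swap would yield determinant $-1$.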

Questions:
Is there any systematic way to find other non-trivial (without $0$ and $1$ entries) rotation matrices with this property?
In particular, it is interesting whether the rotation matrices above are the only ones with rational entries, or whether other rotation matrices with a constant sum of entries exist.
And:
Can it be proved that if the sum of entries in every column of a rotation matrix is constant, then that sum must equal the length of the column vectors?

There are 3 answers below.

Best answer:

The three row vectors of the matrix form an orthonormal frame, so the tips of the vectors lie on the unit sphere. The sum of the entries being a constant corresponds to a plane, $x+y+z=c$, which intersects the sphere along a (small) circle.

You can take any three points on that circle that form an equilateral triangle, and they fulfill the conditions on the rows (in addition to orthonormality). The transform is a rotation around the first-octant bisector.

As the transform is orthogonal, the transpose of the matrix corresponds to the inverse rotation, also a rotation around the bisector, so the condition on the rows holds as well.

By the Rodrigues formula,

$$M=\cos(\theta)\left(\begin{matrix}1&0&0\\0&1&0\\0&0&1\end{matrix}\right)+\frac{1-\cos(\theta)}3\left(\begin{matrix}1&1&1\\1&1&1\\1&1&1\end{matrix}\right)+\frac{\sin(\theta)}{\sqrt3}\left(\begin{matrix}\ \ 0&\ \ 1&-1\\-1&\ \ 0&\ \ 1\\\ \ 1&-1&\ \ 0\end{matrix}\right).$$

There is a one-parameter infinity of solutions. Rational entries occur for $\sin(\theta)=0$ or $\sin(\theta)=\pm\frac{\sqrt3}2$, but these bring nothing new. Other rational solutions are found when $p=\cos(\theta)$ and $q=\dfrac{\sin(\theta)}{\sqrt3}$ are simultaneously rational, i.e. when $p^2+3q^2=\cos^2(\theta)+\sin^2(\theta)=1$.
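A small numerical sketch (assuming NumPy; not part of the original answer) instantiating the formula with the rational pair $p=11/14$, $q=5/14$, which satisfies $p^2+3q^2=1$:

```python
import numpy as np

# Build M(theta) from the Rodrigues decomposition above and check it is a
# rotation with unit row and column sums.  The rational sample
# cos(theta) = 11/14, sin(theta)/sqrt(3) = 5/14 satisfies p^2 + 3 q^2 = 1.
def M(cos_t, sin_t):
    I = np.eye(3)
    J = np.ones((3, 3))
    K = np.array([[ 0, 1, -1],
                  [-1, 0,  1],
                  [ 1, -1, 0]])
    return cos_t * I + (1 - cos_t) / 3 * J + sin_t / np.sqrt(3) * K

p, q = 11 / 14, 5 / 14
R = M(p, q * np.sqrt(3))                   # sin(theta) = q * sqrt(3)
assert np.allclose(R.T @ R, np.eye(3))     # orthogonal
assert np.isclose(np.linalg.det(R), 1.0)   # proper rotation
assert np.allclose(R.sum(axis=0), 1.0)     # constant column sums
print(np.round(R * 14).astype(int))        # entries are multiples of 1/14
```

The resulting matrix has entries $\frac67$, $\frac37$, $-\frac27$, one of the rational examples discussed below.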

Second answer:

This most likely leads to the same set of solutions as Yves Daoust's recipe. Since I often use this method to generate examples of orthogonal matrices with rational entries (and it can be tweaked to satisfy the condition on coordinate sums as well), I will share it with you anyway.

Let $\vec{u}=(u_1,u_2,u_3)$ be any non-zero vector in $\Bbb{R}^3$. It is easy to see that the mapping $$ S:\vec{x}\mapsto\vec{x}-2\,\frac{\vec{x}\cdot\vec{u}}{\Vert\vec{u}\Vert^2}\,\vec{u} $$ is the orthogonal reflection with respect to the plane through the origin that has $\vec{u}$ as its normal. In particular $S$ is a length and angle preserving linear transformation. Therefore the triple $S(\mathbf{i})$, $S(\mathbf{j})$, $S(\mathbf{k})$ is an orthonormal system. Do observe that as a reflection $S$ changes the handedness, so the above system is left-handed. This poses no problem whatsoever, because rewriting them in the order $S(\mathbf{i})$, $S(\mathbf{k})$, $S(\mathbf{j})$ we get a right-handed orthonormal system. A $3\times3$ matrix with columns (or rows) forming a right-handed orthonormal system is a rotation matrix.

Observe that if we pose the extra constraint $u_1+u_2+u_3=0$, then from the formula for $S$ it is obvious that the coordinate sums of $\vec{x}$ and $S(\vec{x})$ are equal. Geometrically this follows from the fact that the reflected images differ from each other by a multiple of $\vec{u}$.

Let's see a few examples. The choice $\vec{u}=(1,1,-2)$ gives your example $$ \begin{aligned} S(\vec{i})&=(1,0,0)-\frac26\vec{u}&=(\frac23,-\frac13,\frac23),\\ S(\vec{k})&=(0,0,1)+\frac46\vec{u}&=(\frac23,\frac23,-\frac13),\\ S(\vec{j})&=(0,1,0)-\frac26\vec{u}&=(-\frac13,\frac23,\frac23). \end{aligned} $$ The choice $\vec{u}=(1,2,-3)$ gives another common example $$ \begin{aligned} S(\vec{i})&=(1,0,0)-\frac2{14}\vec{u}&=(\frac67,-\frac27,\frac37),\\ S(\vec{k})&=(0,0,1)+\frac6{14}\vec{u}&=(\frac37,\frac67,-\frac27),\\ S(\vec{j})&=(0,1,0)-\frac4{14}\vec{u}&=(-\frac27,\frac37,\frac67). \end{aligned} $$

Observe that by using a vector $\vec{u}$ with rational entries we get only rational coordinates. Because $\vec{u}$ and $\lambda\vec{u}$ determine the same reflection we might as well scale all the components of $\vec{u}$ to be integers.
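The recipe above can be sketched in exact rational arithmetic. This is a minimal implementation (the helper name `rotation_from_u` is my own, not from the answer), assuming an integer vector $\vec{u}$ with $u_1+u_2+u_3=0$:

```python
from fractions import Fraction

# For an integer vector u with u1 + u2 + u3 = 0, reflect the standard
# basis in the plane through the origin normal to u, then reorder the
# images as S(i), S(k), S(j) to restore right-handedness.
def rotation_from_u(u):
    n2 = sum(c * c for c in u)                        # |u|^2
    def S(x):                                         # reflection of x
        d = 2 * Fraction(sum(a * b for a, b in zip(x, u)), n2)
        return tuple(a - d * b for a, b in zip(x, u))
    i, j, k = (1, 0, 0), (0, 1, 0), (0, 0, 1)
    return [S(i), S(k), S(j)]                         # rows of the rotation

for row in rotation_from_u((1, 2, -3)):
    print(row)   # rows with entries 6/7, -2/7, 3/7 etc., as in the answer
```

Because everything is done with `Fraction`, the rational entries come out exact rather than as floating-point approximations.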

It may be worth pointing out that swapping the order of $S(\mathbf{j})$ and $S(\mathbf{k})$ can be viewed as another reflection (w.r.t. the plane bisecting the angle between those two vectors and containing $S(\mathbf{i})$). A composition of two reflections is a rotation.

===

Adding an argument settling the auxiliary question about the sum of the coordinates.

If $M$ is an orthogonal real $3\times3$ matrix with constant row sum $\lambda$, then it follows that the vector $(1,1,1)^T$ is an eigenvector of $M$ belonging to eigenvalue $\lambda$. Because multiplication by $M$ preserves lengths of vectors, we must have $|\lambda|=1$. Because $\lambda$ is real we must have $\lambda=\pm1$. Both signs occur. For if $M$ is a rotation matrix with row sum $+1$, then the row sums of $-M$ are all $-1$. By swapping two rows of $-M$ it becomes an orthogonal matrix with row sum $=-1$.

Observe that when the angle of a 3D rotation is not an integer multiple of $180$ degrees, then $+1$ is the only real eigenvalue of the rotation matrix (the other two being $e^{\pm i\theta}$ where $\theta$ is the angle of rotation). This also shows in Yves' answer in his observation that the rotation must have $(1,1,1)$ as its axis.
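A quick numerical check of this eigenvalue argument (a sketch assuming NumPy; the $\frac67$ matrix from above serves as the test case):

```python
import numpy as np

# For a rotation matrix with constant row sum 1, (1,1,1)^T is an
# eigenvector with eigenvalue +1.
M = np.array([[ 6, 3, -2],
              [-2, 6,  3],
              [ 3, -2, 6]]) / 7.0
one = np.ones(3)
assert np.allclose(M @ one, one)           # eigenvalue +1

# Swapping two rows of -M gives an orthogonal matrix with row sum -1,
# showing that both signs of the eigenvalue occur.
N = (-M)[[1, 0, 2], :]
assert np.allclose(N.T @ N, np.eye(3))
assert np.allclose(N.sum(axis=1), -one)
print("eigenvalue checks passed")
```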

Third answer:

This answer is not an independent one; it is just a supplement to Yves' answer, making his formula for generating rational entries of the required matrix more explicit (which is important for me and may be useful for others).

So we have Yves' equation $p^2+3q^2=1$, which can be written as $\left({\dfrac{a}{d}}\right)^2+3{\left({\dfrac{b}{d}}\right)^2} =1$.

Then $a^2+3b^2=d^2$, so $(d-a)(d+a)=3b^2$, and one convenient choice of factorization is $d-a=3$ and $d+a=b^2$.
Solving for $a$ and $d$ in terms of $b$: $d=\dfrac{b^2+3 }{2}$, $a=\dfrac{b^2-3 }{2}$.

This gives a generator for the numbers $a$ and $d$, assuming $b$ is odd (so that $a$ and $d$ are integers).

Examples (with $b$ coprime to $3$, although this seems not to be necessary in the general case):

$b=5$ : $d=\dfrac{5^2+3 }{2}=14$ $a=\dfrac{5^2-3 }{2}=11$.
$b=7$ : $d=\dfrac{7^2+3 } {2}=26$ $a=\dfrac{7^2-3}{2}=23$.
$b=11$ : $d=\dfrac{11^2+3 }{2}=62 $ $a=\dfrac{11^2-3}{2}=59$.
$b=13$ : $d=\dfrac{13^2+3 }{2}=86 $ $a=\dfrac{13^2-3}{2}=83$. ...

...so there is an infinite number of solutions.
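The generator above can be sketched in a few lines of Python, confirming that each $(a,b,d)$ triple yields a rational point on $p^2+3q^2=1$:

```python
from fractions import Fraction

# For odd b, d = (b^2+3)/2 and a = (b^2-3)/2 satisfy a^2 + 3 b^2 = d^2,
# giving rational p = a/d, q = b/d with p^2 + 3 q^2 = 1.
for b in (5, 7, 11, 13):
    d, a = (b * b + 3) // 2, (b * b - 3) // 2
    assert a * a + 3 * b * b == d * d
    p, q = Fraction(a, d), Fraction(b, d)
    assert p * p + 3 * q * q == 1
    print(f"b={b}: a={a}, d={d}")
```

Running it reproduces the examples in the table: $b=5$ gives $a=11$, $d=14$; $b=7$ gives $a=23$, $d=26$; and so on.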

Although there are infinitely many matrices with the properties given in the question, the matrix listed first still seems to be unique: it is the sole non-trivial member of all $4$ (!) of the following families:

  • matrices with rational entries,
  • rotation matrices,
  • matrices with constant row (column) sums,
  • symmetric matrices.