The Optimal Rotation Around a Fixed Axis


So I have two sets of points, $P$ and $Q$, in 3D Euclidean space. Set $P$ has $N$ points and set $Q$ has $M$ points; it is possible that $N=M$, but not necessarily. Now I define the quantity $S$ as follows:

$$ S = \sum_{i=1}^{N} \sum_{j=1}^{M} |\vec{x}_i-\vec{x}_j|^2 $$

where $\vec{x}_i$ ranges over the points of $P$ and $\vec{x}_j$ over the points of $Q$.

By rotating $Q$ around a fixed axis by an angle $\theta$ while $P$ remains stationary, I want to find the angle $\theta_0$ that maximizes $S$.

Question: Is there a closed-form expression for the optimal angle $\theta_0$, and hence for the corresponding rotation matrix?
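For concreteness, the setup can be brute-forced numerically. Below is a minimal NumPy sketch; the point sets `P` and `Q`, the axis `u`, and the helper names are hypothetical choices for illustration, not part of the question:

```python
import numpy as np

def rotation_about_axis(u, theta):
    """Rodrigues' formula: rotation by angle theta about the axis u."""
    u = np.asarray(u, dtype=float)
    u = u / np.linalg.norm(u)
    K = np.array([[0.0, -u[2], u[1]],
                  [u[2], 0.0, -u[0]],
                  [-u[1], u[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def S(P, Q, R):
    """Sum of squared distances between every point of P and every rotated point of Q."""
    diffs = P[:, None, :] - (Q @ R.T)[None, :, :]
    return float(np.sum(diffs ** 2))

# Hypothetical example data: N = 2 points in P, M = 3 points in Q.
P = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 1.0]])
Q = np.array([[0.0, 1.0, 0.0], [1.0, 1.0, 1.0], [2.0, 0.0, 0.0]])
u = np.array([0.0, 0.0, 1.0])

# Brute-force scan over theta; theta0 maximizes S on the grid.
thetas = np.linspace(0.0, 2.0 * np.pi, 10001)
values = [S(P, Q, rotation_about_axis(u, t)) for t in thetas]
theta0 = thetas[int(np.argmax(values))]
```

The question is whether the grid search for `theta0` can be replaced by a closed-form expression.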


2 Answers

Accepted Answer

I'm assuming you are given the axis of the rotation, say along unit vector $u$, and are looking for the optimal angle $\theta_0$ as a function of $u$.

I'm going to change your notation a little bit, so that I don't get confused between the points in $P$ and those in $Q$. Let $$P=\{x_i \text{ s.t. } 1\leq i\leq N\} \text{ and } Q=\{y_j\text{ s.t. } 1\leq j\leq M\}$$ Now let $R$ be a rotation matrix. Then the effect of rotating $Q$ using $R$ would be $$\begin{split} S_R &=\sum_i \sum_j \|x_i-Ry_j\|^2\\ &= \sum_i \sum_j \left(\|x_i\|^2 +\|Ry_j\|^2-2\langle x_i, Ry_j\rangle\right)\\ &= \sum_i \sum_j \left(\|x_i\|^2 +\|y_j\|^2-2\langle x_i, Ry_j\rangle\right)\\ \end{split}$$ where the last equality holds because $\|Ry_j\|=\|y_j\|$. Since the first two terms do not depend on $R$, maximizing $S_R$ amounts to minimizing $$T_R=\sum_i \sum_j \langle x_i, Ry_j\rangle$$

Now, decompose each vector into its components along and perpendicular to the axis $u$: $$x_i=\langle x_i, u\rangle u + x_i^\perp \text{ and } y_j=\langle y_j, u\rangle u + y_j^\perp$$ With this, and noting that $Ru=u$, $$T_R = \sum_i \sum_j \left(\langle x_i, u\rangle \langle y_j, u\rangle + \langle x_i^\perp , Ry_j^\perp\rangle\right)$$ Let $\alpha_{ij}$ be the angle between $x_i^\perp$ and $y_j^\perp$.
Then $$\begin{split} \langle x_i^\perp , Ry_j^\perp\rangle &= \|x_i^\perp\|\cdot\|y_j^\perp\|\cdot\cos(\alpha_{ij}+\theta)\\ &=\|x_i^\perp\|\cdot\|y_j^\perp\|\cdot(\cos(\alpha_{ij})\cos(\theta)-\sin(\alpha_{ij})\sin(\theta)) \end{split}$$ Therefore, differentiating $T_R$ w.r.t. $\theta$ yields $$\frac{\partial T_R}{\partial \theta}=-\sum_i \sum_j \|x_i^\perp\|\cdot\|y_j^\perp\|\cdot(\cos(\alpha_{ij})\sin(\theta)+\sin(\alpha_{ij})\cos(\theta))$$ and setting this to $0$ gives $$\tan \theta_0 = -\frac{\sum_i \sum_j \|x_i^\perp\|\|y_j^\perp\|\sin(\alpha_{ij})}{\sum_i \sum_j \|x_i^\perp\|\|y_j^\perp\|\cos(\alpha_{ij})}$$ (Note that both $\theta_0$ and $\theta_0+\pi$ satisfy this stationarity condition; one minimizes $T_R$ and the other maximizes it, so take the branch that minimizes $T_R$, i.e. maximizes $S_R$.)

This can be further simplified by noting that $$ \|x_i^\perp\|\|y_j^\perp\|\sin(\alpha_{ij}) = \langle x_i \times y_j, u\rangle = \det (x_i, y_j, u)$$ $$\|x_i^\perp\|\|y_j^\perp\|\cos(\alpha_{ij}) = \langle x_i, y_j\rangle - \langle x_i, u\rangle\langle y_j, u\rangle$$ Finally $$\boxed{\theta_0 = -\arctan \frac{\sum_{i=1}^N \sum_{j=1}^M \det (x_i, y_j, u)}{\sum_{i=1}^N \sum_{j=1}^M \left(\langle x_i, y_j\rangle - \langle x_i, u\rangle\langle y_j, u\rangle\right)}}$$ Equivalently, with $$x = \sum_{i=1}^N x_i \text { and } y = \sum_{j=1}^M y_j$$ $$\boxed{\theta_0 = -\arctan \frac{\langle x\times y, u\rangle}{\langle x, y\rangle -\langle x,u\rangle \langle y,u\rangle}}$$
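As a sanity check, the boxed formula can be compared against a brute-force scan of $\theta$. This is a sketch in NumPy; the random point sets are illustrative, and since both $\theta_0$ and $\theta_0+\pi$ satisfy the stationarity condition, the code resolves the branch by evaluating $S$:

```python
import numpy as np

def rot(u, t):
    """Rodrigues rotation matrix: angle t about the unit axis u."""
    K = np.array([[0.0, -u[2], u[1]],
                  [u[2], 0.0, -u[0]],
                  [-u[1], u[0], 0.0]])
    return np.eye(3) + np.sin(t) * K + (1.0 - np.cos(t)) * (K @ K)

def S(P, Q, u, t):
    """Sum of squared distances after rotating Q by angle t about u."""
    d = P[:, None, :] - (Q @ rot(u, t).T)[None, :, :]
    return float(np.sum(d ** 2))

def theta_max(P, Q, u):
    """Stationary angle from the boxed formula; both branches are
    evaluated and the one actually maximizing S is returned."""
    x, y = P.sum(axis=0), Q.sum(axis=0)        # x = sum of x_i, y = sum of y_j
    num = np.dot(np.cross(x, y), u)            # <x cross y, u>
    den = np.dot(x, y) - np.dot(x, u) * np.dot(y, u)
    t = -np.arctan2(num, den)
    return max((t, t + np.pi), key=lambda a: S(P, Q, u, a))

# Illustrative random data (u must be a unit vector).
rng = np.random.default_rng(0)
P = rng.normal(size=(4, 3))
Q = rng.normal(size=(5, 3))
u = np.array([0.0, 0.0, 1.0])

t0 = theta_max(P, Q, u)
grid = np.linspace(0.0, 2.0 * np.pi, 20001)
t_brute = grid[int(np.argmax([S(P, Q, u, t) for t in grid]))]
```

On generic data like this, `S(P, Q, u, t0)` should agree with the grid maximum `S(P, Q, u, t_brute)` up to the grid resolution.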

Answer

No, not really. And the reason is simple if you think one dimension lower. Consider $P = Q = $ two opposite points on the unit circle, say $(1, 0)$ and $(-1, 0)$. Then "rotation by $\pi/2$" and "rotation by $-\pi/2$" both lead to a maximal value for your sum $S$. The "closed form solution" you seek would have to produce one of these two matrices, but which one?

More generally, in higher dimensions, unless $P$ and $Q$ have lots of points, there may be an infinite (continuous) family of optimal solutions. Think of the case where $P$ and $Q$ both consist of the points $(+1, 0, 0), (-1, 0, 0)$ in $\Bbb R^3$. Then a rotation of $\pi/2$ in the $xy$-plane gives you $(0, 1, 0)$ and $(0, -1, 0)$. You can follow this with any rotation in the $yz$-plane, and get an equally good solution.

To actually put this in the form of your question, let $$ Q = \{ (\pm 1, 0, 0) \} \\ P = \{ (0, \pm 1, 0) \} \\ v = (0, 1, 0) $$ Then all rotations of $Q$ about the axis $v$ result in the same value of $S$. In fact, all rotations of $Q$ about the axis $(1,0,0)$ also result in the same value of $S$, as $Q$ is invariant under these rotations.
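This invariance is easy to check numerically. A short sketch (`rot` is Rodrigues' formula for rotation about a unit axis; all other names are hypothetical):

```python
import numpy as np

def rot(u, t):
    """Rodrigues rotation matrix: angle t about the unit axis u."""
    K = np.array([[0.0, -u[2], u[1]],
                  [u[2], 0.0, -u[0]],
                  [-u[1], u[0], 0.0]])
    return np.eye(3) + np.sin(t) * K + (1.0 - np.cos(t)) * (K @ K)

def S(P, Q, R):
    """Sum of squared distances between P and the rotated Q."""
    d = P[:, None, :] - (Q @ R.T)[None, :, :]
    return float(np.sum(d ** 2))

Q = np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]])
P = np.array([[0.0, 1.0, 0.0], [0.0, -1.0, 0.0]])
v = np.array([0.0, 1.0, 0.0])

# S is the same (namely 8: four point pairs, each at squared distance 2)
# for every rotation angle about v.
vals = [S(P, Q, rot(v, t)) for t in np.linspace(0.0, 2.0 * np.pi, 100)]
```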

It really is hopeless.

In case you think that it's the "just two points" thing, consider $P = Q = $ the set of vertices of a regular $n$-gon on the equator (i.e., in the $xy$-plane of the unit sphere), and $v = (0,0,1)$. Then there are at least $n$ distinct rotation angles about $v$ that maximize $S$ (for if $\theta$ is such an angle, so is $\theta + \frac{2\pi k}{n}$ for $k = 1, 2, \ldots, n-1$).
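The $n$-fold degeneracy can be checked directly. A sketch with a regular pentagon (the starting angle $0.7$ is arbitrary; since the $n$-gon is invariant under rotation by $2\pi/n$ about $v$, shifting $\theta$ by $2\pi k/n$ leaves $S$ unchanged):

```python
import numpy as np

def rot_z(t):
    """Rotation by angle t about the z-axis v = (0, 0, 1)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def S(P, Q, R):
    """Sum of squared distances between P and the rotated Q."""
    d = P[:, None, :] - (Q @ R.T)[None, :, :]
    return float(np.sum(d ** 2))

n = 5
phis = 2.0 * np.pi * np.arange(n) / n
P = np.column_stack([np.cos(phis), np.sin(phis), np.zeros(n)])  # regular n-gon
Q = P.copy()

# All n shifted angles give the same S, because Q is invariant
# under rotation by 2*pi/n about the z-axis.
theta = 0.7
vals = [S(P, Q, rot_z(theta + 2.0 * np.pi * k / n)) for k in range(n)]
```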

And now if you think that this is all because I keep putting all the points in a single plane, let $P = Q =$ the points of an octahedron, e.g., $$ \{(\pm1, 0, 0), (0, \pm 1, 0), (0, 0, \pm 1) \} $$ and $v = (1,0,0)$. Then there are at least 4 values of $\theta$ that maximize $S$.

Is symmetry essential? Nope. As Stefan's answer shows, all you really need is for the sum (equivalently, the average) of the points in either $P$ or $Q$ to be the zero vector, and his solution won't provide a unique answer. (And, for computational purposes, if the average is near zero you're probably doomed as well, because the maximum will be so shallow.)