Calculate the rotation matrix aligning $k$ $n$-dimensional vectors


I have $k$ $n$-dimensional vectors, written with respect to two frames related by a rotation:

$X= \{\vec{x}_1,\vec{x}_2,...,\vec{x}_k\}$

and the same rotated vectors:

$X'= \{\vec{x'}_1,\vec{x'}_2,...,\vec{x'}_k\}$

How do I find the rotation matrix that transforms all these vectors? From here I know how to find the rotation matrix for a single vector:

$\vec{x} = R\,\vec{x}'$

but then how do I find the full rotation matrix that rotates the full $X'$ set at the same time?

EDIT2: From the answers below I now proceed as follows:

Build a matrix $Y$ whose columns are the unit vectors $\hat{x}_i$ and a second matrix $Y'$ from the unit vectors $\hat{x}'_i$ (both in the same order), with $i=1,\dots,n$. The rotation matrix is then written as:

$R = Y (Y')^{-1}$

I have tested the method with the following simple example:

Let's suppose I have two sets of vectors, $u = \{\vec{u}_1,\vec{u}_2,\vec{u}_3,...,\vec{u}_k\}$ and $v=\{\vec{v}_1,\vec{v}_2,\vec{v}_3,...,\vec{v}_k\}$, where I know a priori that $v$ was built from a rotation of $u$ (I used this rotation to simulate the $\vec{v}_i$).

Since we are in 3D, I just need three vectors from the full set. Let's suppose the set $u$ contains the following vectors:

$\vec{u}_1 = [3,5,2]$

$\vec{u}_2 = [1,2,8]$

$\vec{u}_3 = [4,3,10]$

and the corresponding $\vec{v}_i$:

$\vec{v}_1 = [4.20, 3.63, 2.69]$

$\vec{v}_2 = [6.78, -2.76,3.93 ]$

$\vec{v}_3 = [10.39,-1.82,3.71]$

Then the transformation matrix $R$ is constructed as:

$Y = (\hat{u_1},\hat{u_2},\hat{u_3})$

$Y' = (\hat{v_1},\hat{v_2},\hat{v_3})$

$R = Y(Y')^{-1}$

Now any of the remaining $k-n$ vectors can be mapped through:

$\vec{u}_i = R \vec{v}_i$
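The procedure above can be sketched in NumPy. Since the $\vec{v}_i$ listed in the example are rounded to two decimals, the sketch below instead simulates them with a known test rotation (about $z$, by an illustrative angle of $0.5$ rad), so the recovery can be checked exactly:

```python
import numpy as np

def recover_rotation(u_vecs, v_vecs):
    """Return R with u_i = R v_i, given n linearly independent
    vector pairs (n = 3 here). Columns of Y and Yp are the
    normalized vectors, taken in the same order."""
    Y = np.column_stack([u / np.linalg.norm(u) for u in u_vecs])
    Yp = np.column_stack([v / np.linalg.norm(v) for v in v_vecs])
    return Y @ np.linalg.inv(Yp)

# the three u vectors from the example above
u = [np.array([3., 5., 2.]),
     np.array([1., 2., 8.]),
     np.array([4., 3., 10.])]

# a known test rotation playing the role of the one used to build the v_i
theta = 0.5
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.],
                   [np.sin(theta),  np.cos(theta), 0.],
                   [0., 0., 1.]])
v = [R_true.T @ ui for ui in u]   # the v_i in the rotated frame

R = recover_rotation(u, v)        # recovers R_true, so u_i = R v_i
```

Note that normalization is not actually required here: $Y = R\,Y'$ holds column by column whether or not the columns are unit vectors, so $Y(Y')^{-1}$ gives the same $R$ either way.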


There are 2 best solutions below


I think the following very minor variant of Gram-Schmidt should work reasonably well:

  1. Remove the largest-magnitude vector in $X$, say $x_i$. Remove the corresponding vector $x'_i$ from $X'$.
  2. Normalize $x_i$ to unit length and add it to the set $Y$. Do the same to $x'_i$ and $Y'$.
  3. Subtract the projection onto $x_i$ of every (remaining) vector in $X$, so that they are all orthogonal to $x_i$. Do the same for $x'_i$ and $X'$.
  4. Repeat 1–3 until $Y$ and $Y'$ each have a full set of $n$ vectors.
  5. Write down $Y$ and $Y'$ as matrices of column vectors (naturally, in the same order as you added them). Now $R = Y(Y')^{-1}$ is a rotation matrix which maps $Y'$ to $Y$, and by rigidity it should also map the rest of $X'$ to $X$.

In Step 1, we don't really need to choose the largest-magnitude vector, just a non-zero vector. I figured choosing the longest vector would be less sensitive to rounding errors.
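A minimal NumPy sketch of steps 1–5, under the assumption that the inputs are exact images of one another under a rotation (the function name is mine, not standard):

```python
import numpy as np

def rotation_via_gram_schmidt(X, Xp):
    """X and Xp are lists of paired n-vectors related by a rotation;
    returns R with x_i = R x'_i."""
    X = [x.astype(float) for x in X]
    Xp = [x.astype(float) for x in Xp]
    n = X[0].size
    Y, Yp = [], []
    for _ in range(n):
        # step 1: pick (and remove) the largest-magnitude vector pair
        i = max(range(len(X)), key=lambda j: np.linalg.norm(X[j]))
        x, xp = X.pop(i), Xp.pop(i)
        # step 2: normalize and collect
        x /= np.linalg.norm(x)
        xp /= np.linalg.norm(xp)
        Y.append(x)
        Yp.append(xp)
        # step 3: orthogonalize the remaining vectors against x and x'
        X = [w - (w @ x) * x for w in X]
        Xp = [w - (w @ xp) * xp for w in Xp]
    Y = np.column_stack(Y)
    Yp = np.column_stack(Yp)
    # step 5: Y' has orthonormal columns, so its inverse is its transpose
    return Y @ Yp.T
```

A side benefit of the orthogonalization: $(Y')^{-1}$ reduces to $(Y')^T$, so no matrix inversion is needed.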


If $X$ has a subset of $n$ linearly independent vectors, then you can completely recover the transformation. Let $B$ be the matrix with these vectors as columns, and $B'$ be the matrix of their images (in the same order, of course). We know that $B'=RB$ for some rotation matrix $R$, therefore $R=B'B^{-1}$. All other vectors in $X$ are linear combinations of these chosen vectors, so $R$ will clearly map them correctly as well.

If you have fewer than $n$ linearly independent vectors, then it might not be possible to recover the entire original transformation, but you may not need to. If all of the vectors to be mapped are in the span of $X$, then you can arbitrarily extend the two sets of vectors to full bases for the space and proceed as above. You most likely won’t end up with a rotation matrix, but the resulting transformation will agree with the original on this subspace. Otherwise, you’ll have to use other information about the original rotation to fill in the missing pieces.

Update: Misread the question—looks like you wanted the transformation which maps $X'$ back to $X$. That’s obviously $BB'^{-1}$ instead.
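A short NumPy sketch of this direct recovery, with an illustrative $B$ and a known test rotation standing in for the unknown one:

```python
import numpy as np

# columns of B are three linearly independent vectors of X
B = np.array([[3., 1., 4.],
              [5., 2., 3.],
              [2., 8., 10.]])

# a known test rotation (about z) standing in for the unknown R
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.],
                   [np.sin(theta),  np.cos(theta), 0.],
                   [0., 0., 1.]])
Bp = R_true @ B                 # B' = R B: columns are the images

R = Bp @ np.linalg.inv(B)       # recovers R (maps X to X')
R_back = B @ np.linalg.inv(Bp)  # maps X' back to X, i.e. R^{-1} = R^T
```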