Rotation matrix in arbitrary dimension to align vector


I was sure it would be trivial to do, but then I got stuck.

Problem - given a vector $u\in\mathbb{R}^d$ in $d$ dimensions, find a rotation matrix $M$, such that

  1. (Rotation) $M^T=M^{-1}$,
  2. (Rotation) $\det(M) = 1$,
  3. $Mu = \lambda\cdot(1,0,\dots, 0)^T$, for some scalar $\lambda > 0$.

That is, $M$ aligns $u$ with the $\hat{x_1}$ vector.

There are 2 solutions below.

BEST ANSWER

(Presumably $d>1$ and $u$ is not a scalar multiple of $(1,0,\ldots,0)^\top$, otherwise simply take $M=I$.)

The systematic way:

  1. Normalise $u$, i.e. replace $u$ by $u/\|u\|$.
  2. Apply the Gram-Schmidt process to extend $u$ to an orthonormal basis $\{u_1=u, u_2, u_3, \ldots, u_d\}$.
  3. Put the vectors $u_j$ together as columns to form a matrix $Q$. Then $Q$ is a real orthogonal matrix. If $\det(Q)=-1$, multiply its last column by $-1$; now $\det(Q)=1$ (and $Q$ remains orthogonal).
  4. Set $M=Q^\top$ and we are done.
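The systematic way can be sketched in NumPy. This is a minimal illustration (the function name `align_rotation_gs` is my own); `np.linalg.qr` carries out the Gram-Schmidt-style orthonormalisation of steps 1-2, and the sign fixes implement step 3:

```python
import numpy as np

def align_rotation_gs(u):
    """Rotation M with M @ u = ||u|| * e1, via Gram-Schmidt / QR."""
    u = np.asarray(u, dtype=float)
    d = u.size
    # Steps 1-2: orthonormalise [u | e1 | ... | e_d]; the first column
    # of Q is then +/- u/||u||, the remaining columns complete the basis.
    A = np.column_stack([u, np.eye(d)])
    Q, R = np.linalg.qr(A)          # reduced mode: Q is d x d
    if R[0, 0] < 0:                 # make the first column exactly +u/||u||
        Q[:, 0] *= -1
    # Step 3: force det(Q) = +1 by flipping the last column if needed.
    if np.linalg.det(Q) < 0:
        Q[:, -1] *= -1
    # Step 4: M = Q^T sends u to ||u|| * e1.
    return Q.T
```

Since the columns of $Q$ are orthonormal with first column $u/\|u\|$, the product $Q^\top u$ has first entry $\|u\|$ and all other entries zero.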

The quick and dirty way:

  1. Normalise $u$, i.e. replace $u$ by $u/\|u\|$.
  2. Let $v=u-(1,0,\ldots,0)^\top$ and take $M=I-\frac{2vv^\top}{\|v\|^2}$, i.e. $M$ is the Householder reflection matrix. You may verify that $M$ is real orthogonal and $Mu=(1,0,\ldots,0)^\top$.
  3. The determinant of a Householder matrix is always $-1$. So, multiply the last row of $M$ by $-1$ to get the final desired matrix. (This does not change $Mu$, whose last component is $0$ for $d>1$.)
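The quick and dirty way is only a few lines of NumPy. A minimal sketch (the function name is my own; it assumes, as above, that $u$ is not already a positive multiple of $e_1$):

```python
import numpy as np

def align_rotation_householder(u):
    """Rotation M with M @ u = ||u|| * e1, via a sign-flipped Householder reflection."""
    u = np.asarray(u, dtype=float)
    d = u.size
    e1 = np.zeros(d)
    e1[0] = 1.0
    un = u / np.linalg.norm(u)                       # step 1: normalise
    v = un - e1                                      # step 2: reflection direction
    H = np.eye(d) - 2.0 * np.outer(v, v) / (v @ v)   # Householder reflection, H @ un = e1
    H[-1, :] *= -1                                   # step 3: det goes from -1 to +1
    return H
```

Flipping the last row only alters the last component of $Hu$, which is already zero, so the alignment property survives the determinant fix.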
SECOND ANSWER

Assuming $d>1$, $u\neq 0$, and $u\neq\lambda[1,0,\ldots,0]^T$, you can use a sequence of $d-1$ Givens rotations to zero out the last $d-1$ (presumably nonzero) components of the vector $u$ as follows:

Let $u^{1}=u$ and $u^{k}=G(1,k,\theta_k)u^{k-1}$ for $k=2,\ldots,d$. For each $k$, the matrix $G(1,k,\theta_k)$ (a Givens rotation acting in the plane of coordinates $1$ and $k$) is chosen such that $$ \begin{bmatrix} c_k & -s_k \\ s_k & c_k \end{bmatrix} \begin{bmatrix} u^{k-1}_1 \\ u^{k-1}_k \end{bmatrix} = \begin{bmatrix} u^k_{1} \\ 0 \end{bmatrix}. $$ This gives the cosine $c_k$ and sine $s_k$ at this step as $$ c_k = u^{k-1}_1/r_k, \quad s_k = -u^{k-1}_k/r_k, \quad r_k=\sqrt{(u^{k-1}_1)^2+(u^{k-1}_k)^2}. $$ The final vector $u^d$ has the form $\lambda[1,0,\ldots,0]^T$, where $\lambda=\|u\|_2$.

The matrix $M$ is then given by $$ M=G(1,d,\theta_d)\cdot G(1,d-1,\theta_{d-1})\cdots G(1,2,\theta_2). $$ It is a product of $d-1$ Givens rotations, hence it is orthogonal and $\mathrm{det}(M)=1$.

NOTE: If $u^{k-1}_k$ is zero then you can skip step $k$, that is, you can set $G(1,k,\theta_k)=I$ where $I$ is the identity matrix.
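The Givens sweep above, including the skip rule from the NOTE, can be sketched in NumPy (the function name `align_rotation_givens` is my own):

```python
import numpy as np

def align_rotation_givens(u):
    """Rotation M with M @ u = ||u|| * e1, as a product of d-1 Givens rotations."""
    u = np.asarray(u, dtype=float)
    d = u.size
    M = np.eye(d)
    w = u.copy()                        # w plays the role of u^{k-1}
    for k in range(1, d):               # zero out components 2, ..., d (0-based 1..d-1)
        if w[k] == 0.0:                 # NOTE above: skip, i.e. G(1,k) = I
            continue
        r = np.hypot(w[0], w[k])        # r_k
        c, s = w[0] / r, -w[k] / r      # c_k, s_k from the formulas above
        G = np.eye(d)                   # embed the 2x2 rotation in rows/cols 1 and k
        G[0, 0], G[k, k] = c, c
        G[0, k], G[k, 0] = -s, s
        w = G @ w                       # u^k: component k is now zero
        M = G @ M                       # accumulate G(1,d)...G(1,2)
    return M
```

Each factor has determinant $c_k^2+s_k^2=1$, so the accumulated product is orthogonal with determinant $1$, matching the claim in the answer.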