Computationally inexpensive method to find a rotation which minimizes the norm of the difference of two tensors


So I have two matrices ${\bf T}_1$ and ${\bf T}_2$. They are tensors in the sense that they can be built as $${\bf T} = \sum_{i} a_i\,{\bf v}_i{\bf v}_i^T$$ with positive real weights $a_i$ and vectors ${\bf v}_i$. Therefore they are symmetric and have an orthonormal basis of eigenvectors with real non-negative eigenvalues (spectral theorem, right?).

Now to my question. What would be a fast way to calculate a rotation matrix $\bf R$ such that $\| {\bf T}_1 - {\bf R} {\bf T}_2 {\bf R}^T \|_F^2$ is minimized?

Bonus points if one could easily incorporate some functionality which punishes large angles. Maybe a regularization like $\lambda \|{\bf R-I}\|_F^2$ or some more suitable one.


My own work so far is limited to realizing that we can do an eigenvalue decomposition of ${\bf T}_1$ and ${\bf T}_2$, sort the eigenvalues, and then find the rotation which pairwise maps the eigenvectors. But surely that cannot be the fastest way..?
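The eigenvector-pairing idea described above can be sketched as follows (a sketch only; it assumes non-degenerate eigenvalues, and `np.linalg.eigh` already returns eigenvalues in ascending order, so the sorting step is automatic):

```python
import numpy as np

def align_by_eigenvectors(T1, T2):
    """Rotate T2's eigenbasis onto T1's, pairing eigenvectors by
    sorted eigenvalue (eigh sorts ascending for both matrices)."""
    _, U1 = np.linalg.eigh(T1)
    _, U2 = np.linalg.eigh(T2)
    R = U1 @ U2.T                 # maps i-th eigenvector of T2 to T1's
    if np.linalg.det(R) < 0:      # flip one sign to get a proper rotation;
        U1[:, 0] *= -1            # this leaves R @ T2 @ R.T unchanged
        R = U1 @ U2.T
    return R
```

When ${\bf T}_1$ and ${\bf T}_2$ share the same eigenvalues, this reproduces ${\bf T}_1$ exactly: $R\,{\bf T}_2 R^T = U_1 \Lambda U_1^T = {\bf T}_1$. The cost is dominated by the two symmetric eigendecompositions, i.e. $O(n^3)$.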

Best answer:

The most you can hope to improve on the solution you found is a constant factor of about $2$. If you had a substantially faster method, you could use it to find the eigenvectors of $\mathbf T_2$: find the optimal $\mathbf R$ aligning it with a non-degenerate diagonal matrix $\mathbf T_1$, and read the eigenvectors of $\mathbf T_2$ off the rows of $\mathbf R$.
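A quick numerical check of this reduction. The hypothetical fast aligner is stood in for here by the eigenvector-pairing construction itself (the name `optimal_R` is illustrative, not a real API); aligning $\mathbf T_2$ with a non-degenerate diagonal $\mathbf T_1$ does diagonalise it:

```python
import numpy as np

rng = np.random.default_rng(1)

def optimal_R(T1, T2):
    # Stand-in for a hypothetical oracle returning the minimizing R;
    # here it is just the eigenvector-pairing construction.
    _, U1 = np.linalg.eigh(T1)
    _, U2 = np.linalg.eigh(T2)
    return U1 @ U2.T

V = rng.standard_normal((6, 3))
T2 = V.T @ V                       # a random symmetric PSD "tensor"
T1 = np.diag([1.0, 2.0, 3.0])      # non-degenerate diagonal target

R = optimal_R(T1, T2)
D = R @ T2 @ R.T
# D is diagonal, so the rows of R are eigenvectors of T2:
# any faster aligner would also be a faster symmetric eigensolver.
assert np.allclose(D, np.diag(np.diag(D)))
```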