I have known symmetric positive-definite matrices $w$ and $s$ such that $$ w = R s R' $$
I want to solve for/characterize $R$, where $R$ is orthogonal with determinant one (i.e. a rotation matrix). Note: $'$ denotes transpose, and $w, s, R \in \mathbb{R}^{n \times n}$. Solutions for $R$ aren't unique (e.g. both $R$ and $-R$ work in two dimensions), but I'd like to characterize their form.
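For concreteness, here is a small numerical sketch of the non-uniqueness in 2D (the matrices are made up for illustration):

```python
import numpy as np

# Made-up 2-D instance: s is SPD and w = R0 s R0' for a rotation R0.
theta = 0.7
R0 = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])
s = np.diag([1.0, 2.0])        # symmetric positive definite
w = R0 @ s @ R0.T

# Both R0 and -R0 solve w = R s R', and in even dimension -R0 is also
# a rotation: det(-R0) = (-1)^2 det(R0) = 1.
for R in (R0, -R0):
    assert np.allclose(w, R @ s @ R.T)
    assert np.isclose(np.linalg.det(R), 1.0)
```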
Thoughts: We can take eigendecompositions of $w$ and $s$ and, noting that the eigenvalues of $w$ and $s$ are equal (they are similar via the orthogonal $R$), write: $$ w = W D W' $$ $$ s = S D S' $$
Substituting into the first equation, we have: $$ W D W' = R S D S' R' $$ $$ D = W' R S D S' R' W $$
But all I think I can say is that $(W' R S) (S' R' W) = (W' R S) (W' R S)' = I$ which gives no useful characterization of $R$.
We also know that, for $w_i, s_i$ the $i^{th}$ eigenvectors of $w, s$ respectively (sorted by their shared eigenvalues $\lambda_i$, assumed distinct): $$ R s_i = \pm w_i $$ $$ w w_i = \lambda_i w_i $$ $$ s s_i = \lambda_i s_i $$
for $i = 1 \dots n$ (the sign ambiguity arises because eigenvectors are only determined up to sign).
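This pairing of eigenvectors suggests the candidate $R = W S'$ (with eigenvalues sorted the same way on both sides). A numerical sketch, assuming distinct eigenvalues and using made-up matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up instance with distinct eigenvalues: s SPD, R0 a rotation,
# and w = R0 s R0' by construction.  (QR of a random matrix gives an
# orthogonal factor.)
n = 3
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = Q @ np.diag([1.0, 2.0, 3.0]) @ Q.T
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
if np.linalg.det(U) < 0:
    U[:, 0] *= -1.0            # force det +1 so U is a rotation
w = U @ s @ U.T

# eigh returns eigenvalues in ascending order for both matrices, so the
# i-th columns of W and S are paired eigenvectors (up to sign).
_, W = np.linalg.eigh(w)
_, S = np.linalg.eigh(s)

R = W @ S.T                    # maps each s_i to +/- w_i
if np.linalg.det(R) < 0:
    W[:, 0] *= -1.0            # flipping one eigenvector's sign keeps w = W D W'
    R = W @ S.T

assert np.allclose(w, R @ s @ R.T)
assert np.isclose(np.linalg.det(R), 1.0)
```

Note that $R s R' = W S' (S D S') S W' = W D W' = w$ regardless of the individual sign choices, which is why the determinant fix above is harmless.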
I can make an overdetermined system of equations from this but I feel like I'm missing something easier. Any help is appreciated!
Here is my guess:
If you multiply both sides by $R$ on the right (using $R' R = I$) you get:
$$ w R - R s = 0 $$
This is a special case of the Sylvester equation $A X + X B = C$ (here with $A = w$, $B = -s$, $C = 0$):
https://en.m.wikipedia.org/wiki/Sylvester_equation
You can form a linear system to solve for $R$ using matrix vectorization, via the identity $\mathrm{vec}(AXB) = (B' \otimes A)\,\mathrm{vec}(X)$:
https://en.m.wikipedia.org/wiki/Vectorization_(mathematics)
$$ A r = 0, \quad \text{where } A = I \otimes w - s \otimes I \ \text{(the transpose drops since $s' = s$)} \text{ and } r = \mathrm{vec}(R) $$
Here $r$ is a column vector (the vectorization of $R$). Since the system is homogeneous, $r = 0$ trivially has minimum norm; what you want is the unit-norm minimizer of $\|A r\|$, which is the right-singular vector of $A$ associated with its smallest singular value. You can find it using SVD:
https://en.m.wikipedia.org/wiki/Singular-value_decomposition
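A sketch of the vectorized system in numpy (matrices made up for illustration; `np.kron` builds $A = I \otimes w - s \otimes I$, and `vec` stacks columns, i.e. Fortran order):

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up instance: s SPD with distinct eigenvalues, R0 a rotation,
# and w = R0 s R0' by construction.
n = 3
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = Q @ np.diag([1.0, 2.0, 3.0]) @ Q.T
R0, _ = np.linalg.qr(rng.standard_normal((n, n)))
if np.linalg.det(R0) < 0:
    R0[:, 0] *= -1.0           # force det +1 so R0 is a rotation
w = R0 @ s @ R0.T

# vec(w R - R s) = (I (x) w - s' (x) I) vec(R); the transpose drops as s' = s.
A = np.kron(np.eye(n), w) - np.kron(s, np.eye(n))
r0 = R0.flatten(order='F')     # vec() stacks columns -> Fortran order
assert np.allclose(A @ r0, 0.0)   # the true R0 lies in the null space of A

# Right singular vectors with (near-)zero singular values span the null space.
sing = np.linalg.svd(A, compute_uv=False)
assert np.sum(sing < 1e-10) == n  # one null direction per distinct eigenvalue
```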
One caveat: taking the minimum-norm/unit-norm SVD solution does not by itself enforce $R R' = I$. For distinct eigenvalues the null space of $A$ is $n$-dimensional, and a generic null vector need not reshape to an orthogonal matrix, so orthogonality has to be checked or enforced separately (e.g. by projecting the reshaped solution onto the nearest orthogonal matrix).
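To see the orthogonality issue numerically (made-up matrices again): the unit-norm null vector from the SVD has Frobenius norm 1 after reshaping, while an orthogonal $n \times n$ matrix has Frobenius norm $\sqrt{n}$, so it cannot itself be orthogonal. A sketch, including a projection onto the nearest orthogonal matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up instance as before: s SPD with distinct eigenvalues, R0 a
# rotation, and w = R0 s R0'.
n = 3
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = Q @ np.diag([1.0, 2.0, 3.0]) @ Q.T
R0, _ = np.linalg.qr(rng.standard_normal((n, n)))
if np.linalg.det(R0) < 0:
    R0[:, 0] *= -1.0
w = R0 @ s @ R0.T

A = np.kron(np.eye(n), w) - np.kron(s, np.eye(n))
_, sing, Vt = np.linalg.svd(A)

# Smallest-singular-value right singular vector, reshaped column-wise.
X = Vt[-1].reshape((n, n), order='F')
assert np.allclose(w @ X, X @ s)            # X does solve wX = Xs ...
assert not np.allclose(X @ X.T, np.eye(n))  # ... but it is not orthogonal
# (||X||_F = 1, while an orthogonal matrix has ||.||_F = sqrt(n))

# Project onto the nearest orthogonal matrix (polar factor via SVD).
# If X happens to be nonsingular, this factor also solves w = R s R'.
U2, _, Vt2 = np.linalg.svd(X)
R = U2 @ Vt2
assert np.allclose(R @ R.T, np.eye(n))
```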