Let's say I have two sets of vectors $\{a_i\}_{i=1}^n$ and $\{b_i\}_{i=1}^n$. I want to find a rotation matrix $R^\star$ such that $$R^\star=\arg\min_R \sum_{i=1}^n (a_i^T R b_i)^2$$
Does this problem have a closed-form solution? If not, what is the best way to find $R^\star$?
If the dimension of the span of the $(a_i)_i$ plus the dimension of the span of the $(b_i)_i$ is at most the dimension $d$ of the ambient space, the minimum is $0$. To see this, replace the $a_i$ by an ONB $v_1,\dots,v_k$ of their span; after a coordinate transformation we may take these to be the first $k$ standard unit vectors. Likewise replace the $b_i$ by an ONB $w_1,\dots,w_\ell$ of their span. Since $k+\ell\le d$, there is a rotation $R$ mapping $w_1,\dots,w_\ell$ to $v_{k+1},\dots,v_{k+\ell}$: extend this assignment to a full orthogonal matrix, and if its determinant is $-1$, flip the sign on one of the $d-\ell\ge k\ge 1$ remaining basis vectors. Then each $Rb_i$ lies in $\operatorname{span}\{v_{k+1},\dots,v_{k+\ell}\}$, which is orthogonal to $\operatorname{span}\{a_i\}$, so every term $a_i^{\mathrm{T}}Rb_i$ vanishes.
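Here is a minimal numerical sketch of this construction in NumPy (the helper `completed_onb` and all variable names are mine, not from the post): two 2-dimensional spans in $\mathbb{R}^5$, a rotation sending the $b$-span into the orthogonal complement of the $a$-span, and a check that the objective is (numerically) zero.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, k, l = 5, 4, 2, 2          # ambient dim 5, spans of dim 2 each: 2 + 2 <= 5
A_basis = rng.standard_normal((d, k))
B_basis = rng.standard_normal((d, l))
a = (A_basis @ rng.standard_normal((k, n))).T   # n vectors inside span(A_basis)
b = (B_basis @ rng.standard_normal((l, n))).T   # n vectors inside span(B_basis)

def completed_onb(basis, d, rng):
    """ONB of R^d whose first columns span the same subspace as `basis`."""
    M = np.hstack([basis, rng.standard_normal((d, d - basis.shape[1]))])
    Q, _ = np.linalg.qr(M)        # full rank almost surely, so QR keeps the span
    return Q

U = completed_onb(A_basis, d, rng)   # U[:, :k] spans the a_i
W = completed_onb(B_basis, d, rng)   # W[:, :l] spans the b_i

# Send w_j -> u_{k+j} for j < l; route the remaining w's to the remaining u's.
perm = list(range(k, k + l)) + [j for j in range(d) if not (k <= j < k + l)]
V = U[:, perm]
if np.linalg.det(V @ W.T) < 0:
    V[:, l] *= -1                 # fix orientation using a column outside span(b)'s image
R = V @ W.T                       # rotation with R*span(b) orthogonal to span(a)

obj = sum((a[i] @ R @ b[i])**2 for i in range(n))
print(np.linalg.det(R), obj)      # det 1, objective numerically 0
```

Flipping column `l` of `V` leaves $Rw_1,\dots,Rw_\ell$ untouched, so the orthogonality argument survives the determinant correction.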
Note that this condition is sufficient but not necessary: zero cost only requires $a_i\perp Rb_i$ for each $i$ individually, not orthogonality of the full spans, so the minimum can be $0$ even when the two span dimensions add up to more than $d$. For instance, in $\mathbb{R}^2$ take $a_1=b_1=e_1$ and $a_2=b_2=e_2$; the rotation by $90^\circ$ has zero diagonal, so both terms vanish although $k+\ell=4>2$. When the minimum is not $0$, to my understanding, identifying $R^\star$ is fairly non-trivial. Even the relaxed problem over all orthogonal matrices is involved, as discussed here. However, to be honest, I haven't looked into this problem in detail, and it might be that a standard approach such as the method of Lagrange multipliers works in this special case. Alternatively, a relaxation of the problem might help, since the cost function $f(M)=\sum_{i=1}^n(a_i^{\mathrm{T}}Mb_i)^2$ over all matrices is convex (and strictly convex precisely when the rank-one matrices $a_ib_i^{\mathrm{T}}$ span the whole matrix space).
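Absent a closed form, one practical route is to parametrize $SO(d)$ by a skew-symmetric generator $S$ via $R=\exp(S)$ and minimize numerically. A sketch with SciPy (everything here, including the random restarts, is my own setup, not from the post):

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

rng = np.random.default_rng(1)
d, n = 3, 6
a = rng.standard_normal((n, d))
b = rng.standard_normal((n, d))

iu = np.triu_indices(d, k=1)      # d(d-1)/2 free parameters of a skew matrix

def rotation(theta):
    S = np.zeros((d, d))
    S[iu] = theta
    S -= S.T                      # S skew-symmetric, so expm(S) is in SO(d)
    return expm(S)

def objective(theta):
    R = rotation(theta)
    return sum((a[i] @ R @ b[i])**2 for i in range(n))

# Several random restarts, since the landscape over SO(d) is generally non-convex.
best = min((minimize(objective, rng.standard_normal(len(iu[0])), method="Nelder-Mead")
            for _ in range(10)), key=lambda r: r.fun)
R_star = rotation(best.x)
print(best.fun, np.linalg.det(R_star))
```

For larger $d$, manifold-optimization toolkits that handle the $SO(d)$ constraint directly (with analytic gradients) would be a more scalable choice than this derivative-free sketch.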