Given a set of vectors $X\in \mathbb R^{n\times d}$, I'm trying to find a rotation matrix $R\in\mathbb R^{d\times d}$ (that is $R R^T=I$) that minimizes the maximum column norm of the product $XR^T$.
If $A=XR^T$, then according to Wikipedia this is also known as the $\|A\|_{\infty,1} = \max_{1\le j\le d}\|A_{:j}\|_2$ norm. (Though I'm not sure if that's a mistake and it should be the $\|A\|_{\infty,2}$ norm?)
In any case, if we dropped the restriction that $R$ be a rotation and optimized $\max_{1\le j\le d}\|A_{:j}\|_1$ instead, it seems like we could just use linear programming.
So I'm wondering if it might be possible to solve my "quadratic" version using Semidefinite Programming (SDP)?
Or maybe the orthogonality constraint on $R$ makes the problem non-convex, and we have to resort to heuristic methods. Does anybody know whether this problem has been studied before, and which methods tend to work best?
Note first of all that the problem is equivalent to finding $R$ such that the maximal diagonal entry of the positive semi-definite matrix $RX^TXR^T$ is minimized. Since the trace is fixed (it is the same as the trace of $X^TX$), you just want to make all diagonal entries equal to each other.

The quickest way to do it I can think of is to run the (Jacobi) rotation method in reverse. Take the largest and the smallest norm columns (or, which is the same, the largest and the smallest diagonal entries of $X^TX$) and apply a 2D rotation in their plane that makes one of the resulting diagonal entries (i.e., squared column norms) exactly right: $1/d$ of the trace. Then put this one aside and repeat with one fewer column, and so on. I believe you'll be able to implement this algorithm yourself, but if you have any questions, do not hesitate to ask. :-)
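A minimal NumPy sketch of this scheme, in case it helps (function and variable names are my own, not from any library). Each step picks the largest and smallest remaining diagonal entries of $S = X^TX$, then solves $a\cos 2\theta + b\sin 2\theta = m$ in closed form for the Givens angle that pins the smaller entry to $\operatorname{tr}(S)/d$:

```python
import numpy as np

def equalizing_rotation(X):
    """Build a rotation R (R @ R.T = I) so that all diagonal entries of
    R @ X.T @ X @ R.T -- i.e. all squared column norms of X @ R.T --
    equal trace(X.T @ X) / d.  Sketch of the reverse-Jacobi idea above."""
    S = X.T @ X                 # d x d PSD Gram matrix
    d = S.shape[0]
    t = np.trace(S) / d         # common target for every diagonal entry
    R = np.eye(d)
    active = list(range(d))     # indices whose diagonal is not yet fixed
    while len(active) > 1:
        diags = [S[k, k] for k in active]
        i = active[int(np.argmax(diags))]   # largest remaining entry
        j = active[int(np.argmin(diags))]   # smallest remaining entry
        if i == j:                          # all remaining entries equal t
            break
        # New S[j, j] after a rotation by theta in the (i, j) plane is
        #   (S_ii + S_jj)/2 + a*cos(2*theta) + b*sin(2*theta),
        # with a, b, m as below; solvable because S_jj <= t <= S_ii.
        a = (S[j, j] - S[i, i]) / 2.0
        b = S[i, j]
        m = t - (S[i, i] + S[j, j]) / 2.0
        r = np.hypot(a, b)
        if r < 1e-15:                       # S_ii == S_jj == t already
            active.remove(j)
            continue
        phi = np.arctan2(b, a) + np.arccos(np.clip(m / r, -1.0, 1.0))
        theta = phi / 2.0
        c, s = np.cos(theta), np.sin(theta)
        G = np.eye(d)                       # Givens rotation in plane (i, j)
        G[i, i], G[i, j] = c, -s
        G[j, i], G[j, j] = s, c
        S = G @ S @ G.T                     # update Gram matrix
        R = G @ R                           # accumulate the rotation
        active.remove(j)                    # S[j, j] is now exactly t
    return R
```

After $d-1$ such rotations the last diagonal entry equals $t$ automatically, since the trace is preserved; the max column norm of $X R^T$ then attains its lower bound $\sqrt{\operatorname{tr}(X^TX)/d}$.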