Let $\{X_1,\dots,X_K\}$ be a set of random matrices, where $X_k\in\mathbb{R}^{M\times N}$ for $k=1,\dots,K$, and let $U\in\mathbb{R}^{M\times r}$ and $V\in\mathbb{R}^{N\times r}$ be two matrices with orthonormal columns (i.e., $U^\top U = I$, $V^\top V = I$). I was wondering whether the following problem has an analytical solution:
$$\displaystyle\max_{U,V} \sum_{k=1}^K \|U^\top X_k V\|_F^2$$
If not, how should I solve it? Alternating optimization?
(At first, I thought it might be related to the SVD of the sum of the matrices $\{X_k\}$, but so far I have no idea how to prove anything along those lines.)
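For what it's worth, for $K=1$ I believe there is a closed form: if $X_1 = P \Sigma Q^\top$ is the SVD, then taking $U$ and $V$ to be the top $r$ left and right singular vectors gives

$$\|U^\top X_1 V\|_F^2 = \sigma_1^2 + \dots + \sigma_r^2,$$

which is optimal, since $U$ and $V$ being isometries forces $\sigma_i(U^\top X_1 V) \le \sigma_i(X_1)$. It is the general case $K > 1$ that I am stuck on.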
$$\begin{array}{ll} \text{maximize} & \displaystyle\sum_{k=1}^{K} \| \mathrm U^{\top} \mathrm X_k \mathrm V \|_{\text F}^2\\ \text{subject to} & \mathrm U^{\top} \mathrm U = \mathrm I\\ & \mathrm V^{\top} \mathrm V = \mathrm I\end{array}$$
Let
$$f_k (\mathrm U, \mathrm V) := \| \mathrm U^{\top} \mathrm X_k \mathrm V \|_{\text F}^2 = \mbox{tr} \left( \mathrm U^{\top} \mathrm X_k \mathrm V \mathrm V^{\top} \mathrm X_k^{\top} \mathrm U \right) = \mbox{tr} \left( \mathrm V^{\top} \mathrm X_k^{\top} \mathrm U \mathrm U^{\top} \mathrm X_k \mathrm V\right)$$
Hence,
$$\partial_{\mathrm U} \, f_k (\mathrm U, \mathrm V) = 2 \,\mathrm X_k \mathrm V \mathrm V^{\top} \mathrm X_k^{\top} \mathrm U$$
$$\partial_{\mathrm V} \, f_k (\mathrm U, \mathrm V) = 2 \,\mathrm X_k^{\top} \mathrm U \mathrm U^{\top} \mathrm X_k \mathrm V$$
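As a sanity check on these gradient formulas (summed over $k$), here is a small finite-difference test in NumPy; the dimensions and seed are arbitrary choices, not anything from the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, r, K = 5, 4, 2, 3
Xs = [rng.standard_normal((M, N)) for _ in range(K)]
U = rng.standard_normal((M, r))
V = rng.standard_normal((N, r))

def f(U, V):
    # sum_k ||U^T X_k V||_F^2
    return sum(np.linalg.norm(U.T @ X @ V, 'fro')**2 for X in Xs)

# Analytic gradients from the formulas above, summed over k
gU = 2 * sum(X @ V @ V.T @ X.T @ U for X in Xs)
gV = 2 * sum(X.T @ U @ U.T @ X @ V for X in Xs)

# Compare directional derivatives against central differences
eps = 1e-6
for _ in range(3):
    dU = rng.standard_normal(U.shape)
    dV = rng.standard_normal(V.shape)
    fd = (f(U + eps*dU, V + eps*dV) - f(U - eps*dU, V - eps*dV)) / (2*eps)
    an = np.sum(gU * dU) + np.sum(gV * dV)
    print(abs(fd - an))  # should be tiny (finite-difference error only)
```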
Let the Lagrangian be
$$\mathcal L (\mathrm U, \mathrm V, \Lambda_1, \Lambda_2) := \sum_{k=1}^{K} f_k (\mathrm U, \mathrm V) - \langle \Lambda_1, \mathrm U^{\top} \mathrm U - \mathrm I \rangle - \langle \Lambda_2, \mathrm V^{\top} \mathrm V - \mathrm I \rangle$$
where the Lagrange multipliers $\Lambda_1$ and $\Lambda_2$ can be taken to be symmetric matrices, since the constraint residuals $\mathrm U^{\top} \mathrm U - \mathrm I$ and $\mathrm V^{\top} \mathrm V - \mathrm I$ are symmetric. Taking the partial derivatives with respect to $\mathrm U$ and $\mathrm V$,
$$\partial_{\mathrm U} \mathcal L (\mathrm U, \mathrm V, \Lambda_1, \Lambda_2) = 2 \sum_{k=1}^{K} \mathrm X_k \mathrm V \mathrm V^{\top} \mathrm X_k^{\top} \mathrm U - 2 \mathrm U \Lambda_1$$
$$\partial_{\mathrm V} \mathcal L (\mathrm U, \mathrm V, \Lambda_1, \Lambda_2) = 2 \sum_{k=1}^{K} \mathrm X_k^{\top} \mathrm U \mathrm U^{\top} \mathrm X_k \mathrm V - 2 \mathrm V \Lambda_2$$
Setting these partial derivatives to zero and appending the constraints, we obtain two cubic matrix equations in $\mathrm U, \mathrm V, \Lambda_1, \Lambda_2$ and two quadratic matrix equations in $\mathrm U$ and $\mathrm V$:
$$\boxed{\begin{array}{rl} \displaystyle\sum_{k=1}^{K} \mathrm X_k \mathrm V \mathrm V^{\top} \mathrm X_k^{\top} \mathrm U &= \mathrm U \Lambda_1\\ \displaystyle\sum_{k=1}^{K} \mathrm X_k^{\top} \mathrm U \mathrm U^{\top} \mathrm X_k \mathrm V &= \mathrm V \Lambda_2\\ \mathrm U^{\top} \mathrm U &= \mathrm I\\ \mathrm V^{\top} \mathrm V &= \mathrm I \end{array}}$$
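One consequence worth noting: left-multiplying the first equation by $\mathrm U^{\top}$, the second by $\mathrm V^{\top}$, and using the orthonormality constraints gives

$$\Lambda_1 = \mathrm U^{\top} \left( \sum_{k=1}^{K} \mathrm X_k \mathrm V \mathrm V^{\top} \mathrm X_k^{\top} \right) \mathrm U, \qquad \Lambda_2 = \mathrm V^{\top} \left( \sum_{k=1}^{K} \mathrm X_k^{\top} \mathrm U \mathrm U^{\top} \mathrm X_k \right) \mathrm V$$

so that at any stationary point $\mbox{tr} \left( \Lambda_1 \right) = \mbox{tr} \left( \Lambda_2 \right) = \sum_{k=1}^{K} f_k (\mathrm U, \mathrm V)$, i.e., the multipliers carry the objective value.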
How can one solve these matrix equations? I do not know.
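In lieu of a closed form, the alternating optimization hinted at in the question is straightforward and monotonically non-decreasing: with $\mathrm V$ fixed, the objective is $\mbox{tr} \left( \mathrm U^{\top} \mathrm A \mathrm U \right)$ with $\mathrm A := \sum_k \mathrm X_k \mathrm V \mathrm V^{\top} \mathrm X_k^{\top}$ positive semidefinite, so the optimal $\mathrm U$ consists of the top $r$ eigenvectors of $\mathrm A$, and symmetrically for $\mathrm V$. A minimal NumPy sketch (function and variable names are my own, and this is a heuristic that finds a stationary point, not necessarily the global maximum):

```python
import numpy as np

def alternating_max(Xs, r, iters=100, seed=0):
    """Alternating maximization of sum_k ||U^T X_k V||_F^2
    over U^T U = I, V^T V = I. Each half-step is an exact
    eigenvector subproblem, so the objective never decreases."""
    rng = np.random.default_rng(seed)
    M, N = Xs[0].shape
    # Random orthonormal initialization for V via QR
    V, _ = np.linalg.qr(rng.standard_normal((N, r)))
    for _ in range(iters):
        # V fixed: U <- top-r eigenvectors of A = sum_k X_k V V^T X_k^T
        A = sum((X @ V) @ (X @ V).T for X in Xs)
        _, Q = np.linalg.eigh(A)   # eigenvalues in ascending order
        U = Q[:, -r:]
        # U fixed: V <- top-r eigenvectors of B = sum_k X_k^T U U^T X_k
        B = sum((X.T @ U) @ (X.T @ U).T for X in Xs)
        _, Q = np.linalg.eigh(B)
        V = Q[:, -r:]
    return U, V

# Example usage: the returned pair satisfies the boxed equations
# up to numerical tolerance at convergence
rng = np.random.default_rng(1)
Xs = [rng.standard_normal((6, 5)) for _ in range(3)]
U, V = alternating_max(Xs, r=2)
print(sum(np.linalg.norm(U.T @ X @ V, 'fro')**2 for X in Xs))
```

Since the feasible set (a product of Stiefel manifolds) is compact and each half-step maximizes exactly, the objective values converge; multiple random restarts guard against poor local maxima.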