A regular SVD decomposition of a matrix $X\in\mathbb{R}^{n\times m}$ is $$ X = UDV^\top, \qquad U\in\mathbb{R}^{n\times r},\ D\in\mathbb{R}^{r\times r},\ V \in\mathbb{R}^{m\times r},$$ where $U$ and $V$ have orthonormal columns, i.e., $U^\top U = V^\top V = I_r$.
Now, I want a slightly modified SVD, where everything remains the same except that $U$ satisfies a generalized orthonormality condition, $$ U^\top G U = I, \qquad G\in\mathbb{R}^{n\times n}. $$ A direct solution is to first compute a Cholesky decomposition $G = R^\top R$, then take a regular SVD of $RX$, written $\tilde{U} D V^\top$, and finally transform $\tilde{U}$ back to $U$ by left-multiplying by $R^{-1}$.
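For concreteness, here is a minimal sketch of that direct approach in NumPy/SciPy (the function name `g_orthonormal_svd` is mine; it assumes $G$ is symmetric positive definite and uses triangular back-substitution in place of an explicit $R^{-1}$):

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

def g_orthonormal_svd(X, G):
    """G-orthonormal SVD: X = U D V^T with U^T G U = I, V^T V = I.

    Direct approach: Cholesky G = R^T R, regular SVD of R X,
    then U = R^{-1} Utilde (via triangular solve).
    """
    R = cholesky(G)                              # upper triangular, G = R^T R
    Ut, s, Vt = np.linalg.svd(R @ X, full_matrices=False)
    U = solve_triangular(R, Ut)                  # solves R U = Utilde
    return U, s, Vt.T
```

The triangular solve replaces forming $R^{-1}$ explicitly, which is the standard numerical practice; the correctness check is that $U^\top G U = \tilde{U}^\top R^{-\top} (R^\top R) R^{-1} \tilde{U} = \tilde{U}^\top \tilde{U} = I$.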
However, $n$ can be large, in which case inverting $R$ costs $O(n^3)$, far more than the $O(mn\min(m,n))$ cost of a regular SVD. Is there a way to compute this more efficiently?
Since $XV=UD$ and $U^TGX=DV^T\iff X^TGU=VD$, you can instead solve the eigenvalue problem $$ \begin{pmatrix}0&X\\X^TG&0\end{pmatrix}\begin{pmatrix}u\\v\end{pmatrix}=\pm\sigma\begin{pmatrix}u\\v\end{pmatrix}. $$ One would have to check whether the first step of a reduction to Hessenberg form still produces a tridiagonal matrix, as in the Golub–Kahan algorithm for $G=I$.
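As a small numerical sanity check (my own illustration, not part of the answer): the nonzero eigenvalues of the block matrix above come in $\pm\sigma_i$ pairs, where the $\sigma_i$ are the singular values of $RX$ with $G = R^\top R$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3
X = rng.standard_normal((n, m))
A = rng.standard_normal((n, n))
G = A @ A.T + n * np.eye(n)                      # symmetric positive definite

# Block matrix M = [[0, X], [X^T G, 0]]; eigenvalues are +/- sigma_i and zeros
M = np.block([[np.zeros((n, n)), X],
              [X.T @ G, np.zeros((m, m))]])
eigvals = np.linalg.eigvals(M)

# Reference singular values from the Cholesky route
R = np.linalg.cholesky(G).T                      # upper triangular, G = R^T R
sigma = np.linalg.svd(R @ X, compute_uv=False)
```

The eigenvalues are real despite $M$ being nonsymmetric, since $M$ is similar to the symmetric block matrix $\begin{pmatrix}0 & RX\\ (RX)^T & 0\end{pmatrix}$ via $\operatorname{diag}(R, I)$.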