As part of my Final Degree Project I am recreating some of the calculations in this paper by H. Casini and M. Huerta.
On page 11 we are given the following equations (54–56):
\begin{equation} \begin{cases} \alpha^*n\beta^T-\alpha (n+1)\beta^\dagger=\frac{1}{2}\\ \alpha^*n\alpha^T+\alpha (n+1)\alpha^\dagger=X\\ \beta^*n\beta^T+\beta (n+1)\beta^\dagger=P \end{cases} \end{equation}
where $\alpha$ and $\beta$ are unknown matrices, $n$ is a real diagonal matrix, $X$ and $P$ are both real and symmetric, $1$ denotes the identity matrix, and $\dagger$ the conjugate transpose (I believe that is all the relevant information).
From these equations, it is said in the paper that one can show that $\alpha$ and $\beta$ can be written as $\alpha=\alpha'U$ and $\beta=\beta'U$, where $\alpha'$ and $\beta'$ are both real and $U$ is a diagonal unitary matrix. This essentially amounts to saying that all elements in the $i$-th column of both $\alpha$ and $\beta$ share the same complex argument (up to $\pi$, since the elements of $\alpha$ and $\beta$ are not necessarily positive). Correct me if I'm wrong. However, I have not managed to prove this, and I have been at it for quite some time.
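(As a sanity check of this interpretation, here is a small numerical sketch of my own in Python/NumPy, with variable names of my choosing: starting from an arbitrary real $\alpha'$ and diagonal unitary $U$, every column of $\alpha=\alpha'U$ indeed has a single complex argument mod $\pi$, i.e. $\alpha_{ik}\alpha_{jk}^*$ is real for all $i,j,k$.)

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
alpha_prime = rng.normal(size=(d, d))                   # arbitrary real matrix
U = np.diag(np.exp(1j * rng.uniform(0, 2 * np.pi, d)))  # diagonal unitary
alpha = alpha_prime @ U                                 # alpha = alpha' U

# each column of alpha has a single complex argument mod pi:
# alpha_ik * conj(alpha_jk) is real for every i, j, k
for k in range(d):
    col = alpha[:, k]
    assert np.allclose(np.outer(col, col.conj()).imag, 0)
```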
I know that all three equations can be rewritten as:
\begin{equation} \begin{cases} \alpha\beta^\dagger=2i\mathrm{Im}(\alpha^*n\beta^T)-\frac{1}{2}\\ \alpha\alpha^\dagger=X-2\mathrm{Re}(\alpha^*n\alpha^T)\\ \beta\beta^\dagger=P-2\mathrm{Re}(\beta^*n\beta^T) \end{cases} \end{equation}
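(These rearrangements rest on the fact that $\alpha^*n\beta^T$ is the entrywise conjugate of $\alpha n\beta^\dagger$ when $n$ is real, so the difference of the two is purely imaginary and the "self" sum is purely real. A quick NumPy check of the matrix identities for arbitrary $\alpha,\beta$ and real diagonal $n$, of my own devising:)

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
alpha = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
beta = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
n = np.diag(rng.normal(size=d))                 # real diagonal

# alpha* n beta^T = conj(alpha n beta^dagger), so zbar - z = 2i Im(zbar)
z_bar = alpha.conj() @ n @ beta.T
z = alpha @ n @ beta.conj().T
assert np.allclose(z_bar - z, 2j * z_bar.imag)

# likewise zbar + z = 2 Re(zbar) for the alpha-alpha (and beta-beta) terms
w_bar = alpha.conj() @ n @ alpha.T
w = alpha @ n @ alpha.conj().T
assert np.allclose(w_bar + w, 2 * w_bar.real)
```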
The last two tell us that $\forall i,j$ we have $\sum_k\alpha_{ik}\alpha_{jk}^*\in\mathbb{R}$ and $\sum_k\beta_{ik}\beta_{jk}^*\in\mathbb{R}$, but that does not necessarily imply that $\alpha_{ik}\alpha_{jk}^*\in\mathbb{R}$ for all $i,j,k$ (and likewise for $\beta$), which is what the claimed factorization requires.
I assume there is something very trivial that I am missing; I just don't know what. I would appreciate any insights or suggestions on how to proceed.
Assuming that you are not misquoting the authors, their claim is false. Consider e.g. $n=0$, $$ \alpha=\pmatrix{\frac{1}{\sqrt{2}}&\frac{1+i}{2}\\ \frac{-1+i}{2}&\frac{1}{\sqrt{2}}}, $$ $\beta=-\frac{1}{2}\alpha$, $X=\alpha\alpha^\dagger$ and $P=\beta\beta^\dagger$. Then $\alpha$ is unitary and all three equations are satisfied, but $\alpha$ cannot possibly be written as the product of a real matrix and a unitary diagonal matrix because $\frac{\alpha_{21}}{\alpha_{11}}$ is not real.
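(A numerical verification of this counterexample, in NumPy, sketched under the stated choices $n=0$, $\beta=-\frac12\alpha$, $X=\alpha\alpha^\dagger$, $P=\beta\beta^\dagger$:)

```python
import numpy as np

s = 1 / np.sqrt(2)
alpha = np.array([[s, (1 + 1j) / 2],
                  [(-1 + 1j) / 2, s]])
beta = -alpha / 2
n = np.zeros((2, 2))
X = alpha @ alpha.conj().T
P = beta @ beta.conj().T
I = np.eye(2)

# alpha is unitary
assert np.allclose(alpha @ alpha.conj().T, I)
# the three equations (54-56) hold with n = 0
assert np.allclose(alpha.conj() @ n @ beta.T - alpha @ (n + I) @ beta.conj().T, I / 2)
assert np.allclose(alpha.conj() @ n @ alpha.T + alpha @ (n + I) @ alpha.conj().T, X)
assert np.allclose(beta.conj() @ n @ beta.T + beta @ (n + I) @ beta.conj().T, P)
# yet alpha_21 / alpha_11 is not real, so no column-wise common phase exists
assert abs((alpha[1, 0] / alpha[0, 0]).imag) > 1e-9
```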
Their claim is true, however, when all singular values of $2n+1$ are distinct. The three equations mean that $$ \begin{align} &\alpha\beta^\dagger=-\frac{1}{2}-2i\operatorname{Im}(\alpha n\beta^\dagger),\tag{1}\\ &\alpha\alpha^\dagger\text{ is real},\\ &\beta\beta^\dagger\text{ is real}.\\ \end{align} $$ Hence $\alpha\alpha^\dagger$ and $\beta\beta^\dagger$ are real positive semidefinite, and from the polar decompositions of $\alpha$ and $\beta$ we see that $\alpha=au$ and $\beta=bv$ for some real positive semidefinite matrices $a,b$ and some unitary matrices $u,v$.

Now split equation $(1)$ into real and imaginary parts to obtain $$ \begin{align} &\operatorname{Re}(auv^\dagger b^T)=-\frac12,\\ &\operatorname{Im}(au(2n+1)v^\dagger b^T)=0.\tag{2}\\ \end{align} $$

Note that if $x^Ta=0$ for some real vector $x$, then $0=\operatorname{Re}(x^Tauv^\dagger b^T)=x^T\operatorname{Re}(auv^\dagger b^T)=-\frac12x^T$. Hence $x$ is necessarily zero and $a$ is nonsingular. Likewise, $b$ is nonsingular too. Therefore $(2)$ implies that $r=u(2n+1)v^\dagger$ is a real matrix.

So, if all singular values of $2n+1$ are distinct and $q_1\sqrt{(2n+1)^2}\,q_2^T$ is a singular value decomposition of $r$ over $\mathbb R$, we must have $u=q_1w_1$ and $v=q_2w_2$ for some unitary diagonal matrices $w_1$ and $w_2$. Therefore $\alpha=(aq_1)w_1$ is the product of a real matrix $aq_1$ and a unitary diagonal matrix $w_1$, and likewise for $\beta$.
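(For what it's worth, the last step can also be seen numerically. The NumPy sketch below is my own construction, not from the paper: I build $u,v$ so that $r=u(2n+1)v^\dagger$ is real, take the real SVD of $r$, and check that with distinct singular values the recovered left factor matches $u$ up to a diagonal unitary. The entries of $n$ are sorted descending so the SVD's ordering does not permute the columns.)

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
n = np.diag([2.6, 1.7, 0.9, 0.3])       # distinct entries, sorted so the
                                        # singular values of 2n+1 come out ordered
q1, _ = np.linalg.qr(rng.normal(size=(d, d)))   # real orthogonal
q2, _ = np.linalg.qr(rng.normal(size=(d, d)))
w = np.diag(np.exp(1j * rng.uniform(0, 2 * np.pi, d)))  # diagonal unitary
u, v = q1 @ w, q2 @ w                   # chosen so that r below comes out real

r = u @ (2 * n + np.eye(d)) @ v.conj().T
assert np.allclose(r.imag, 0)           # r = u (2n+1) v^dagger is real

# real SVD of r: with distinct singular values the left factor equals q1 up to
# column signs, so w1 = U^T u must be a diagonal unitary matrix
U, S, Vh = np.linalg.svd(r.real)
w1 = U.T @ u
assert np.allclose(w1 - np.diag(np.diag(w1)), 0)
assert np.allclose(np.abs(np.diag(w1)), 1)
```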