I am facing the following system of matrix equations:
$$ b X = B $$ $$ X A X^T = I $$
where $X$ is the square matrix of dimension $N \times N$ to solve for, $b$ and $B$ are both row vectors of dimension $1 \times N$, $A$ is a symmetric positive semidefinite matrix of dimension $N \times N$ and $I$ the identity matrix of dimension $N \times N$.
I've tried a variety of ways to approach the problem, one of which was to work through the quadratic matrix equation (obtained by combining the two equations above) $$ X A X^T + b^T b X - (I + b^T B) = 0 $$ However, I did not manage to come up with an analytical solution.
I'm surprised that such a seemingly 'simple' problem is in fact so sneaky and tricky to crack... but I might be approaching it completely wrong!
I would be very grateful if you could point out something I missed, or suggest any clever idea for obtaining the closed-form solution.
Remark 1 : This problem is actually a reformulation of a problem I encountered while reading the following paper, where the matrix $X$ defined above corresponds to the matrix $D$ in Section 5, pp. 9-10 of the paper ($D$ encodes a change of variables that maps a Gaussian random vector to a standard one, with an additional constraint, hence the system of 2 equations above). It should be noted that in the original problem, the vector $b$ has a specific structure: $$ b = [b_1, 0, \dots, 0] $$ If it turns out that this "sparse" structure can be leveraged to make the problem easier to solve, please feel free to provide an answer in that direction.
Remark 2 : With the "sparse" structure of $b$ and in the 2-dimensional case, it's straightforward to work out an analytical solution by hand: the first equation determines the first row of the matrix $D$, and plugging this into the second equation allows solving for the remaining components. The matrix $D$ (or $X$ as defined in my post) obtained this way is not symmetric... which may give a hint about the general structure of the solution we are looking for (and suggests that attacking the quadratic matrix equation might not be a good idea, as I seem to remember there are only "easy" solutions in the symmetric case). In any case, this hand-wavy approach becomes very tedious for $N \geq 3$, so it's a no-go.
Remark 3 : The authors do not mention how to obtain the matrix $D$, which either means the solution is immediate... or not straightforward at all. It kind of reminds me of those "proofs left for homework" teachers love to leave their students :)
This is very straightforward.
For convenience and clarity, let us write $b$ and $B$ as $u^T$ and $v^T$ respectively. The constraint $bX=B$ now reads $u^TX=v^T$. Clearly, the system of equations is solvable only if $A$ is positive definite (since $XAX^T=I$ forces $X$ to be invertible, so that $A=X^{-1}X^{-T}$) and $u^Tu\,(=u^TXAX^Tu)=v^TAv$.
Suppose both necessary conditions are satisfied. We shall see that they constitute a set of sufficient conditions too. Write $A$ as a Gram matrix $LL^T$. E.g. $L$ can be taken as the Cholesky factor of $A$ or the positive definite square root of $A$. The necessary condition that $u^Tu=v^TAv$ is then equivalent to $\|u^T\|=\|v^TL\|$, and the equation $XAX^T=I$ is equivalent to $(XL)(XL)^T=I$, i.e., $Q=XL$ is an orthogonal matrix.
Thus the problem boils down to finding an orthogonal matrix $Q$ such that $u^TQL^{-1}=v^T$, or equivalently, that $QL^Tv=u$. Since $\|L^Tv\|=\|u\|$, the orthogonal matrix $Q$ exists. E.g. when $L^Tv\ne u$, you may take $Q$ as the Householder reflection along $L^Tv-u$. Alternatively, you may take $Q$ as the rotation matrix that rotates $L^Tv$ to $u$ and whose restriction on $\left(\operatorname{span}\{L^Tv,u\}\right)^\perp$ is the identity map.
Once $Q$ is obtained, we have $X=QL^{-1}$.
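For completeness, here is a minimal NumPy sketch of this construction. The function name, the compatibility check, and the choice of the Cholesky factor for $L$ are my own; any orthogonal $Q$ with $QL^Tv=u$ would do equally well.

```python
import numpy as np

def solve_system(A, u, v):
    """Find X with u^T X = v^T and X A X^T = I, assuming A is positive definite."""
    L = np.linalg.cholesky(A)              # A = L L^T
    w = L.T @ v                            # we need an orthogonal Q with Q w = u
    if not np.isclose(u @ u, v @ A @ v):   # necessary condition: u^T u = v^T A v
        raise ValueError("no solution: u^T u != v^T A v")
    d = w - u
    if np.allclose(d, 0.0):
        Q = np.eye(len(u))                 # w = u already: any Q fixing u works
    else:
        d = d / np.linalg.norm(d)
        Q = np.eye(len(u)) - 2.0 * np.outer(d, d)  # Householder reflection: Q w = u
    return Q @ np.linalg.inv(L)            # X = Q L^{-1}
```

Given a positive definite $A$ and compatible $u$, $v$ (e.g. $u = [\|L^Tv\|, 0, \dots, 0]^T$, matching the "sparse" $b$ of Remark 1), the returned $X$ satisfies both $u^TX = v^T$ and $XAX^T = I$ up to floating-point error.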