How can one solve the following equation for $\mathbf{X}$:
$$\mathbf{A} = a (b\mathbf{I} + \mathbf{X})^{-1/2} (c\mathbf{I} + \mathbf{X}) (b\mathbf{I} + \mathbf{X})^{-1/2}$$
where $a,b,c \in \mathbb{C}$ and $\mathbf{A} \in \mathbb{C}^{N \times N}$?
Thanks in advance!
Let $U^{1/2}=\exp\left(\tfrac{1}{2}\log(U)\right)$. But which $\log$ is this? And is $U^{1/2}$ a polynomial in $U$? That depends on the definition of the above $\log$; as stated, the question leaves this ambiguous.
Everything works well when we assume that $U$ has no eigenvalue in $(-\infty,0]$ and we choose the principal $\log$. Then $\log(U)$ and, consequently, $U^{1/2}$ and $U^{-1/2}=(U^{1/2})^{-1}$ are polynomials in $U$; they are primary matrix functions (cf. Higham, *Functions of Matrices*). Moreover, $(U^{1/2})^2=U$. We assume in what follows that we are in this situation.
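As an illustrative numerical check (not part of the original answer), one can verify with `scipy.linalg.sqrtm` that for a matrix $U$ with no eigenvalue in $(-\infty,0]$, the principal square root satisfies $(U^{1/2})^2=U$ and commutes with $U$, as a polynomial in $U$ must:

```python
import numpy as np
from scipy.linalg import sqrtm

# Example matrix U: diagonal with positive entries plus a small
# perturbation, so its spectrum stays away from (-inf, 0].
rng = np.random.default_rng(1)
U = np.diag([1.0, 2.0, 4.0]) + 0.05 * rng.standard_normal((3, 3))

R = sqrtm(U)                      # principal square root U^{1/2}
print(np.allclose(R @ R, U))      # (U^{1/2})^2 = U
print(np.allclose(R @ U, U @ R))  # U^{1/2} commutes with U
```

Both checks print `True` up to floating-point tolerance.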
Here the hypothesis on $X$ is: if $\lambda$ is an eigenvalue of $X$, then $b+\lambda\notin(-\infty,0]$. Since $(bI+X)^{-1/2}$ is a polynomial in $X$, it commutes with $cI+X$, so $A=a(cI+X)(bI+X)^{-1}$; in particular $A$ is a polynomial in $X$ and $AX=XA$. The equation can then be rewritten as
$$(bI+X)A=a(cI+X),\text{ or }X(A-aI)=acI-bA,$$
which is elementary: provided $A-aI$ is invertible, $X=(acI-bA)(A-aI)^{-1}$. Note that $A-aI=a(c-b)(bI+X)^{-1}$, so this invertibility holds precisely when $a\neq 0$ and $b\neq c$.
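A quick numerical sanity check of this closed form (a sketch, not part of the original answer): build an $X$ whose spectrum keeps $bI+X$ away from $(-\infty,0]$, form $A$ from the original equation, and recover $X$ as $(acI-bA)(A-aI)^{-1}$:

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(0)
N = 4
a, b, c = 2.0, 3.0, 1.5   # arbitrary example values with a != 0, b != c

# X with eigenvalues near [0.5, 2], so b*I + X avoids (-inf, 0]
X = np.diag(rng.uniform(0.5, 2.0, N)) + 0.1 * rng.standard_normal((N, N))

I = np.eye(N)
S = np.linalg.inv(sqrtm(b * I + X))   # (bI + X)^{-1/2}
A = a * S @ (c * I + X) @ S           # the original equation

# Closed-form solution: X = (acI - bA)(A - aI)^{-1}
X_rec = (a * c * I - b * A) @ np.linalg.inv(A - a * I)
print(np.allclose(X_rec, X))          # should print True
```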