I would like to solve a matrix equation of the form $$ \mathbf{A} \mathbf{X} + \mathbf{X} \mathbf{A}^T = \mathbf{B} $$
where $\mathbf{A}$ and $\mathbf{B}$ are known $n \times n$ matrices, and $\mathbf{X}$ is an unknown $n \times n$ matrix.
- Is there a general way to isolate $\mathbf X$ in this expression?
- Is there a solution for various special cases, such as (1) the case where $\mathbf{A}$ and $\mathbf{B}$ are real, or (2) when $\mathbf{B}$ is real and diagonal?
My intuition is that the case where $\mathbf{A}$ and $\mathbf{B}$ are both real may be solved using the SVD of $\mathbf{A}$. Is there at least a way to compute an approximate solution for $\mathbf{X}$ using the pseudoinverse?
The Bartels-Stewart algorithm works only when the linear map $f:X\mapsto AX+XA^T$ is one-to-one.
Let $\operatorname{spectrum}(A)=(\lambda_i)_i$. Since $\operatorname{spectrum}(f)=\{\lambda_i+\lambda_j \mid i,j\}$, Bartels-Stewart works if and only if $\lambda_i+\lambda_j\neq 0$ for every $i,j$. Note that if this condition is not fulfilled, then the equation may have no solution; for example, take
$$A=\operatorname{diag}(1,-1),\qquad B=\begin{pmatrix}0&0\\1&0\end{pmatrix}.$$
More precisely, for this value of $A$ the image of $X\mapsto AX+XA^T$ consists exactly of the diagonal matrices, so the equation $AX+XA^T=B$ has a solution only for the rare (measure-zero) set of diagonal $B$; the $B$ above is not diagonal, so no solution exists.
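To make this concrete, here is a minimal numerical sketch. It assumes SciPy is available: `scipy.linalg.solve_continuous_lyapunov` implements Bartels-Stewart for $AX + XA^H = Q$ and handles the regular case, while the singular case (and the question's pseudoinverse idea) can be handled by vectorizing the equation into the Kronecker-sum system $(I\otimes A + A\otimes I)\operatorname{vec}(X)=\operatorname{vec}(B)$ and applying least squares:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# --- Regular case: every eigenvalue sum lambda_i + lambda_j is nonzero. ---
# solve_continuous_lyapunov (Bartels-Stewart) solves A X + X A^H = Q directly.
A = np.array([[-2.0, 1.0],
              [ 0.0, -3.0]])
B = np.eye(2)
X = solve_continuous_lyapunov(A, B)
assert np.allclose(A @ X + X @ A.T, B)

# --- Singular case from the counterexample: A = diag(1, -1). ---
# With column-major vec, vec(A X + X A^T) = (I (x) A + A (x) I) vec(X),
# and the Kronecker sum's eigenvalues are exactly the sums lambda_i + lambda_j.
n = 2
A = np.diag([1.0, -1.0])
K = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))
assert np.linalg.matrix_rank(K) < n * n  # the map f is not one-to-one here

# A diagonal right-hand side lies in the image of f, so least squares
# (min-norm, pseudoinverse-based) recovers an exact solution:
B = np.diag([2.0, 4.0])
x, *_ = np.linalg.lstsq(K, B.flatten(order="F"), rcond=None)
X = x.reshape(n, n, order="F")
assert np.allclose(A @ X + X @ A.T, B)

# The off-diagonal B from the counterexample is NOT in the image; lstsq
# then returns only the best approximation in the Frobenius norm.
B_bad = np.array([[0.0, 0.0],
                  [1.0, 0.0]])
x_bad, *_ = np.linalg.lstsq(K, B_bad.flatten(order="F"), rcond=None)
X_bad = x_bad.reshape(n, n, order="F")
residual = np.linalg.norm(A @ X_bad + X_bad @ A.T - B_bad)
```

The Kronecker-sum route costs $O(n^6)$ for a dense solve, so it is only a fallback for small $n$ or for the singular/inconsistent case; when all $\lambda_i+\lambda_j\neq 0$, Bartels-Stewart at $O(n^3)$ is the right tool.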