Let $B \in M_{n \times n}(\mathbb{R})$ be a fixed symmetric positive definite matrix. Consider the anti-commutator map
$$f:M_{n \times n}(\mathbb{R})\to M_{n \times n}(\mathbb{R})$$
via
$$f(C)=BC+CB$$
We can prove that $f$ is an injective linear operator and thus invertible. (Choose an orthonormal basis in which $B$ is diagonal, say with eigenvalues $\lambda_i > 0$; then $f$ scales the $(i,j)$ entry by $\lambda_i + \lambda_j > 0$, and everything is clear.) My question:
Is there a good expression for $f^{-1}$?
I am looking for an expression for $f^{-1}(D)$ as some polynomial or power series in terms of $B, B^{-1}, D$.
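For what it's worth, the diagonalization argument already yields an explicit entrywise formula for $f^{-1}$, which is easy to check numerically. A minimal NumPy sketch (the random $B$ and $D$ are just illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random symmetric positive definite B and arbitrary right-hand side D.
A = rng.standard_normal((n, n))
B = A @ A.T + n * np.eye(n)
D = rng.standard_normal((n, n))

# Orthogonally diagonalize B = Q diag(lam) Q^T.
lam, Q = np.linalg.eigh(B)

# In the eigenbasis, f multiplies entry (i, j) by lam_i + lam_j > 0,
# so f^{-1}(D) = Q [ (Q^T D Q)_{ij} / (lam_i + lam_j) ] Q^T.
C = Q @ ((Q.T @ D @ Q) / (lam[:, None] + lam[None, :])) @ Q.T

# Check: f(C) = BC + CB recovers D.
print(np.allclose(B @ C + C @ B, D))  # True
```

This is exact (up to floating point), but it goes through an eigendecomposition rather than a polynomial or power series in $B$, $B^{-1}$, $D$.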
We can use the more general result solving the Lyapunov equation. We wish to solve the equation $$ BC + CB^H = D $$ for $C$ (for real symmetric $B$, $B^H = B$). Changing the sign of both sides, we note that $-B$ is stable (in the sense that its eigenvalues have negative real part). The Lyapunov equation with a stable coefficient then gives $$ (-B)C + C(-B)^H = -D \implies\\ C = -\int_0^\infty e^{-B\tau}(-D)e^{-B\tau}\,d\tau = \int_0^\infty e^{-B\tau}De^{-B\tau}\,d\tau $$
So, we may write our inverse as $$ f^{-1}(D) = \int_0^\infty e^{-B\tau}De^{-B\tau}\,d\tau $$ Of course, this uses a matrix-valued integral, and $e^{-B \tau}$ is implicitly a power series in $B$.
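A quick numerical sanity check that $C = \int_0^\infty e^{-B\tau}De^{-B\tau}\,d\tau$ solves $BC + CB = D$ (assumes SciPy; `quad_vec` handles the matrix-valued integrand, and the finite upper limit stands in for $\infty$ since the integrand decays exponentially):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad_vec

rng = np.random.default_rng(1)
n = 3

# Symmetric positive definite B and a generic right-hand side D.
A = rng.standard_normal((n, n))
B = A @ A.T + n * np.eye(n)
D = rng.standard_normal((n, n))

# C = ∫_0^∞ e^{-Bτ} D e^{-Bτ} dτ; the integrand decays like e^{-2λ_min τ},
# so truncating the integral at τ = 50 is far below floating-point precision.
C, _ = quad_vec(lambda t: expm(-B * t) @ D @ expm(-B * t), 0, 50)

print(np.allclose(B @ C + C @ B, D))  # True
```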
If we write this map as a matrix with respect to the canonical basis of $M_{n \times n}$, identifying $C$ with its column-stacked vectorization $\operatorname{vec}(C)$ (as is somewhat explained here), we find that $$ [f]_{\mathcal B} = I \otimes B + B^T \otimes I $$ and so we're ultimately looking for the inverse of the sum of the two commuting matrices $I \otimes B$ and $B^T \otimes I$. We can get this inverse as a power series, though I'm not sure we can guarantee its convergence.
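The Kronecker identity $\operatorname{vec}(BC + CB) = (I \otimes B + B^T \otimes I)\operatorname{vec}(C)$ is easy to verify numerically, keeping in mind that it uses the column-stacking convention for $\operatorname{vec}$ (Fortran order in NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
B = A @ A.T + np.eye(n)   # symmetric positive definite
C = rng.standard_normal((n, n))
I = np.eye(n)

# Matrix of f in the canonical basis: vec(BC + CB) = (I⊗B + B^T⊗I) vec(C),
# using (X^T ⊗ A) vec(Y) = vec(A Y X) with column-stacking vec.
M = np.kron(I, B) + np.kron(B.T, I)

vec = lambda X: X.flatten(order="F")  # column-stacking vec
print(np.allclose(M @ vec(C), vec(B @ C + C @ B)))  # True
```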
Let $M = I \otimes B + B^T \otimes I$. We have $$ (I \otimes B^{-1})M = I+ B^T \otimes B^{-1} $$ From there, we have (formally) $$ [I+ B^T \otimes B^{-1}]^{-1} = \sum_{k=0}^\infty (-1)^k[B^T \otimes B^{-1}]^k $$ Moreover, since $I \otimes B$ commutes with $M$, $$ [(I \otimes B^{-1})M]^{-1} = M^{-1}(I \otimes B) = (I \otimes B)M^{-1} $$ All together, $$ M^{-1} = (I \otimes B^{-1})\sum_{k=0}^\infty (-1)^k[B^T \otimes B^{-1}]^k $$ Translated back into operators (via $(X^T \otimes A)\operatorname{vec}(Y) = \operatorname{vec}(AYX)$), we have $$ f^{-1}(D) = \sum_{k=0}^\infty (-1)^k B^{-(k+1)} D B^{k} $$ However, we have no reason to suspect that this sum should converge. Notably, with the $2$-norm, we find that $$ \|B^T \otimes B^{-1}\| = (\lambda_{\max}(BB^T \otimes (BB^T)^{-1}))^{1/2} = \|B\| \cdot \|B^{-1}\| \geq 1 $$
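The norm computation can be checked numerically (here `cond` is just shorthand for the product $\|B\|\cdot\|B^{-1}\|$, i.e. the $2$-norm condition number of $B$):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n))
B = A @ A.T + np.eye(n)   # symmetric positive definite

K = np.kron(B.T, np.linalg.inv(B))
norm = np.linalg.norm(K, 2)   # spectral norm of B^T ⊗ B^{-1}
cond = np.linalg.norm(B, 2) * np.linalg.norm(np.linalg.inv(B), 2)

print(np.isclose(norm, cond), cond >= 1)  # True True
```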
A strategy to get around non-convergence may be as follows: for $t > 0$, the function $$ f_t(C) = t\,BC + CB $$ has inverse $$ f_t^{-1}(D) = t^{-1} \sum_{k=0}^\infty (-1)^k t^{-k}\, B^{-(k+1)} D B^{k} $$ and this series converges whenever $t > \|B\| \cdot \|B^{-1}\|$. Perhaps we could then take a limit as $t \to 1^+$ of the resulting expression.
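A sketch of this truncated series, implementing $f_t^{-1}(D) = t^{-1}\sum_{k\ge 0}(-1)^k t^{-k} B^{-(k+1)} D B^k$ for a safely large $t$ (here $t = 2\|B\|\|B^{-1}\|$, chosen so successive terms shrink by a factor of about $2$; 200 terms is overkill):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
A = rng.standard_normal((n, n))
B = A @ A.T + np.eye(n)   # symmetric positive definite
D = rng.standard_normal((n, n))

t = 2 * np.linalg.cond(B)  # safely above ||B||·||B^{-1}||, so the series converges

# Accumulate t^{-1} Σ_k (-1)^k t^{-k} B^{-(k+1)} D B^k, keeping each term
# fully scaled so nothing overflows along the way.
Binv = np.linalg.inv(B)
term = Binv @ D / t        # the k = 0 term
C = np.zeros_like(D)
for k in range(200):
    C += term
    term = -(Binv @ term @ B) / t   # next term: multiply by -t^{-1} B^{-1}(·)B

# Check: f_t(C) = t BC + CB recovers D.
print(np.allclose(t * B @ C + C @ B, D))  # True
```

The same loop with $t$ close to $1$ illustrates the problem: the terms no longer decay, so the partial sums wander instead of settling, which is why some summability method (or the $t \to 1^+$ limit of the closed form) would be needed.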