I want to calculate the inverse of a matrix of the form $S = A\otimes B+C$, where $A$ and $B$ are symmetric and invertible, and $C$ is a diagonal matrix with positive entries. When the dimension is high, direct calculation is expensive, so I wonder whether it is possible to speed up the computation by taking advantage of the algebraic structure.
For example, let $A$ be an $m\times m$ matrix and $B$ an $n\times n$ matrix. Then $(A\otimes B)^{-1}=A^{-1}\otimes B^{-1}$, so the approximate cost in floating point operations (flops) can be reduced from $7m^3n^3/3$ to $7(m^3+n^3)/3+m^2n^2$.
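A quick numpy check of the identity $(A\otimes B)^{-1}=A^{-1}\otimes B^{-1}$ (the diagonal shifts are only there to make the random symmetric matrices well conditioned):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 4

# Random symmetric, invertible A and B (shifted to be well conditioned)
A = rng.standard_normal((m, m))
A = A + A.T + 2 * m * np.eye(m)
B = rng.standard_normal((n, n))
B = B + B.T + 2 * n * np.eye(n)

# Inverting the mn x mn Kronecker product directly ...
lhs = np.linalg.inv(np.kron(A, B))
# ... versus inverting the small factors and taking their Kronecker product
rhs = np.kron(np.linalg.inv(A), np.linalg.inv(B))

err = np.abs(lhs - rhs).max()
```

The second route only ever inverts an $m\times m$ and an $n\times n$ matrix, which is where the flop savings quoted above come from.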
Since the matrix $S$ also has some special structure, any idea on the efficient calculation of $S^{-1}$?
Many thanks!
- In addition, what about $S^{-1}a$, where $a$ is an $mn\times 1$ vector?
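For $S^{-1}a$ specifically, one option is to avoid forming $S$ at all: since $(A\otimes B)\,\mathrm{vec}(X) = \mathrm{vec}(AXB^T)$ under row-major vectorization, a matrix-vector product with $S$ costs only $O(m^2n + mn^2)$ flops, and a Krylov solver such as conjugate gradients needs nothing but such products. A minimal sketch (the function name `solve_kron_plus_diag` is mine, and CG assumes $S$ is symmetric positive definite, which holds e.g. when $A$ and $B$ are positive definite):

```python
import numpy as np

def solve_kron_plus_diag(A, B, d, a, tol=1e-10, maxiter=1000):
    """Solve (A kron B + diag(d)) x = a by conjugate gradients,
    without ever forming the mn x mn matrix S explicitly.
    Assumes S is symmetric positive definite."""
    m, n = A.shape[0], B.shape[0]

    def matvec(x):
        # Row-major vec identity: (A kron B) x = vec(A X B^T), X = x.reshape(m, n)
        X = x.reshape(m, n)
        return (A @ X @ B.T).reshape(-1) + d * x

    x = np.zeros_like(a)
    r = a - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Sp = matvec(p)
        alpha = rs / (p @ Sp)
        x += alpha * p
        r -= alpha * Sp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol * np.linalg.norm(a):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

In exact arithmetic CG converges in at most $mn$ iterations, and often far fewer when $C$ dominates, since then $S$ is well conditioned.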
Here's a method that might work well. Let $C$ have diagonal elements $d_1,\dots,d_{mn}$, and write $$ C = d_1 e_1e_1^T + \cdots + d_{mn} e_{mn} e_{mn}^T. $$ Now we can find the inverse by applying the Sherman-Morrison formula iteratively, $mn$ times. That is, we've broken the computation into a sequence of rank-1 updates.
What's nice is that this allows you to leverage your formula for $(A \otimes B)^{-1}$. I'm not sure how efficient that will be in practice, but it's a start.
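A minimal numpy sketch of this scheme (the function name `kron_plus_diag_inv` is mine): start from $A^{-1}\otimes B^{-1}$ and apply one Sherman-Morrison update per diagonal entry of $C$, using $(M + d_k e_ke_k^T)^{-1} = M^{-1} - \frac{d_k\, M^{-1}e_k e_k^T M^{-1}}{1 + d_k\, e_k^T M^{-1} e_k}$:

```python
import numpy as np

def kron_plus_diag_inv(A, B, d):
    """Invert S = A kron B + diag(d) via mn Sherman-Morrison
    rank-1 updates, starting from (A kron B)^{-1} = A^{-1} kron B^{-1}."""
    # Cheap starting point: invert only the small factors
    Minv = np.kron(np.linalg.inv(A), np.linalg.inv(B))
    for k, dk in enumerate(d):
        # Sherman-Morrison update for M <- M + dk * e_k e_k^T:
        # M^{-1} e_k is column k of Minv; e_k^T M^{-1} is row k
        scale = dk / (1.0 + dk * Minv[k, k])
        Minv -= scale * np.outer(Minv[:, k], Minv[k, :])
    return Minv
```

One caveat worth noting: each rank-1 update touches the full $mn\times mn$ inverse and so costs $O(m^2n^2)$ flops, giving $O(m^3n^3)$ in total over the $mn$ updates; any savings over direct inversion are in the constant, not the asymptotic order.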