Infinite symmetrical matrix sum (discrete Lyapunov equation)


I have two symmetric matrices ($A$ and $B$) and I am looking to find the sum $S$: $$S=A+BAB+B^2AB^2+\ldots$$ Or in summation format: $$S=\sum_{i=0}^\infty B^iAB^i$$ We know that the magnitudes of all the eigenvalues $\lambda_i$ of $B$ are less than $1$, so the sum converges.

I tried to use the same trick as for a geometric sum but I can't factor out both $B$'s at the same time.

To simplify, we can assume that $B$ is diagonal, but I expect the result to hold for general symmetric $B$.

EDIT:

It seems to be a discrete Lyapunov equation, $$BSB-S+A=0,$$ since all matrices here are symmetric.

https://en.wikipedia.org/wiki/Lyapunov_equation
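As a numerical sanity check of this identification (a sketch, assuming NumPy/SciPy are available): SciPy's `solve_discrete_lyapunov(M, Q)` solves $MXM^H - X + Q = 0$, which for symmetric $B$ is exactly the equation above, so its solution should match a truncated version of the series.

```python
# Sketch: verify B S B - S + A = 0 matches the truncated series sum_i B^i A B^i.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

rng = np.random.default_rng(0)
n = 4

# Random symmetric B, rescaled so all eigenvalues lie strictly inside (-1, 1).
M = rng.standard_normal((n, n))
B = (M + M.T) / 2
B *= 0.9 / np.max(np.abs(np.linalg.eigvalsh(B)))

# Random symmetric A.
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

# solve_discrete_lyapunov(B, A) solves B X B^H - X + A = 0.
S = solve_discrete_lyapunov(B, A)

# Truncated series: terms decay geometrically, so a few hundred suffice here.
S_series = np.zeros_like(A)
term = A.copy()
for _ in range(500):
    S_series += term
    term = B @ term @ B

print(np.allclose(S, S_series))
```

The spectral-radius rescaling (the `0.9` factor) is just one way to manufacture a test matrix satisfying the convergence condition; any symmetric $B$ with $|\lambda_i|<1$ works.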


BEST ANSWER

It is a Lyapunov equation. With vectorisation, you may use the Neumann series trick. That is, $$ \operatorname{vec}(S)=(I\otimes I+B\otimes B+B^2\otimes B^2+\cdots)\operatorname{vec}(A)=(I\otimes I-B\otimes B)^{-1}\operatorname{vec}(A). $$ Since $(I\otimes I-B\otimes B)^{-1}$ is in general not a Kronecker product, the result of $(I\otimes I-B\otimes B)^{-1}\operatorname{vec}(A)$ does not de-vectorise to a nice closed matrix form.
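The vectorised identity can be checked directly in NumPy (a sketch; note that with the column-major convention $\operatorname{vec}(BSB)=(B^T\otimes B)\operatorname{vec}(S)$, which equals $(B\otimes B)\operatorname{vec}(S)$ here because $B$ is symmetric):

```python
# Sketch: solve (I⊗I - B⊗B) vec(S) = vec(A), then check B S B - S + A = 0.
import numpy as np

rng = np.random.default_rng(1)
n = 3

# Symmetric B with spectral radius < 1, symmetric A.
M = rng.standard_normal((n, n))
B = (M + M.T) / 2
B *= 0.8 / np.max(np.abs(np.linalg.eigvalsh(B)))
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

# Column-major (Fortran-order) vectorisation to match vec(BSB) = (B⊗B) vec(S).
K = np.eye(n * n) - np.kron(B, B)
vecS = np.linalg.solve(K, A.reshape(-1, order="F"))
S = vecS.reshape((n, n), order="F")

# S should satisfy the discrete Lyapunov equation.
print(np.allclose(B @ S @ B - S + A, 0))
```

Solving the $n^2 \times n^2$ system costs $O(n^6)$, which is why this route is mainly of theoretical interest; the diagonalisation below is much cheaper.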

Numerically, if $B=QDQ^T$ is an orthogonal diagonalisation and $C=Q^TAQ$, then $S=Q(C+DCD+D^2CD^2+\cdots)Q^T=QMQ^T$ where $m_{ij}=c_{ij}\sum_{k=0}^\infty d_i^kd_j^k=\frac{c_{ij}}{1-d_id_j}$. This offers an efficient way to evaluate $S$ when the matrices are small.
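A minimal sketch of this diagonalisation recipe (assuming NumPy; `eigh` provides the orthogonal diagonalisation $B=QDQ^T$):

```python
# Sketch: S = Q M Q^T with m_ij = c_ij / (1 - d_i d_j), where C = Q^T A Q.
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Symmetric B with eigenvalues inside (-1, 1), symmetric A.
M0 = rng.standard_normal((n, n))
B = (M0 + M0.T) / 2
B *= 0.7 / np.max(np.abs(np.linalg.eigvalsh(B)))
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

d, Q = np.linalg.eigh(B)          # B = Q @ diag(d) @ Q.T
C = Q.T @ A @ Q
Mmat = C / (1.0 - np.outer(d, d))  # entrywise geometric series 1/(1 - d_i d_j)
S = Q @ Mmat @ Q.T

# S should satisfy B S B - S + A = 0.
print(np.allclose(B @ S @ B - S + A, 0))
```

The whole computation is one eigendecomposition plus a few matrix products, $O(n^3)$ overall, so it scales well beyond small matrices too.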