Explicit or closed form of the matrix product series $\sum_{n=0}^{\infty} A^n B^n /n!$, which is close to the matrix exponential $\exp(A B)$


Is there an explicit or closed form of the following matrix product series

$$ \sum_{n=0}^{\infty} \frac{A^n B^n}{n!} = ?$$

$A$ and $B$ are real-valued square matrices that do not commute, i.e., $\left[A,B\right]\neq0$. $A$ is a diagonal matrix; $B$ is a symmetric matrix.

The power series representation $ \sum_{n=0}^{\infty} \frac{(A B)^n}{n!}$ of the matrix exponential $ \exp(A B) $ to some extent resembles the series above.
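To make the distinction concrete, here is a small numerical sketch (the specific matrices are arbitrary choices satisfying the stated assumptions: $A$ diagonal, $B$ symmetric, $[A,B]\neq0$) showing that the series does not coincide with $\exp(AB)$:

```python
import numpy as np
from scipy.linalg import expm

A = np.diag([1.0, 2.0])            # example diagonal A
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])         # example symmetric B; [A, B] != 0

def series_sum(A, B, terms=60):
    """Truncated sum_{n=0}^{terms-1} A^n B^n / n!."""
    S, An, Bn, fact = np.zeros_like(A), np.eye(2), np.eye(2), 1.0
    for n in range(terms):
        S = S + (An @ Bn) / fact
        An, Bn, fact = An @ A, Bn @ B, fact * (n + 1)
    return S

Y = series_sum(A, B)
E = expm(A @ B)
print(np.abs(Y - E).max())         # clearly nonzero: the two objects differ
```

For this particular pair one can even evaluate the series in closed form ($B^2 = I$, so $Y = \operatorname{diag}(\cosh 1, \cosh 2) + \operatorname{diag}(\sinh 1, \sinh 2)\,B$), which the truncated sum reproduces.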

Note that there is an exact integral representation of the power series expression $\sum_{n=0}^{\infty} \frac{A^n B^n}{n!}$, namely \begin{align} A^{-1}\int_0^\infty dt\, e^{-t A^{-1}} \sum_{n=0}^\infty \frac{(2 \sqrt{t B})^{2n}}{4^n (n!)^2} = \sum_{n=0}^{\infty} \frac{A^n B^n}{n!}. \end{align} The inner sum is $I_0(2\sqrt{tB})$, where $ I_0 (x):=\sum_{n=0}^\infty x^{2n}/(4^n (n!)^2) $ is the zeroth-order modified Bessel function of the first kind.
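A scalar sanity check of this representation (a sketch with arbitrarily chosen $a,b>0$): for scalars it reduces to $\frac{1}{a}\int_0^\infty e^{-t/a}\, I_0(2\sqrt{tb})\,dt = e^{ab}$, which can be verified by quadrature:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import i0

a, b = 0.5, 0.8   # arbitrary positive scalars

# (1/a) * ∫_0^∞ e^{-t/a} I0(2*sqrt(t*b)) dt  should equal  e^{a*b}
val, err = quad(lambda t: np.exp(-t / a) * i0(2.0 * np.sqrt(t * b)), 0.0, np.inf)
val /= a
print(val, np.exp(a * b))   # the two values agree
```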

The underlying matrix ODE is \begin{align} \dot{Y}(t) = A Y(t) B \end{align} with the solution $Y(t) = \sum_{n=0}^{\infty} \frac{A^n B^n t^n}{n!}$.

However, I am looking for an expression that is explicit without any integral representation or ODE solution.

Also, it would be enlightening to consider the special case where A is the identity matrix plus a small and diagonal (perturbation) Matrix $\sigma$, i.e. $A\approx 1+\sigma$.

Any suggestions to solve this issue are welcome.


There are 2 answers below.

Answer 1

Any suggestions to solve this issue are welcome.

I don't really have any good ideas, but if it so happened that $A$ and $B$ anticommuted, like the Pauli matrices $\sigma_3$ and $\sigma_1$, then you may note the alternating signs in $$ (AB)^2= -A^2 B^2,\\ (AB)^3= -A^3 B^3,\\ (AB)^4= A^4 B^4, \\ (AB)^5= A^5 B^5,\\ (AB)^6= -A^6 B^6,\\ (AB)^7= -A^7 B^7, $$ and so on.

You then have
$$ Y=I +ABt-\frac{(AB)^2t^2}{2!} -\frac{(AB)^3t^3}{3!} +\frac{(AB)^4t^4}{4!} +\frac{(AB)^5t^5}{5!}- \frac{(AB)^6t^6}{6!}- \frac{(AB)^7t^7}{7!} +...\\ =\cos (ABt) + \sin (ABt), $$ which you may refashion into exponentials, if you must...

It manifestly solves your differential equation.
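This can be checked numerically for the Pauli pair $\sigma_3, \sigma_1$ (a sketch using SciPy's matrix cosine and sine):

```python
import numpy as np
from scipy.linalg import cosm, sinm

# Pauli matrices: sigma_3 is diagonal, sigma_1 is symmetric, and they anticommute.
A = np.array([[1.0, 0.0], [0.0, -1.0]])   # sigma_3
B = np.array([[0.0, 1.0], [1.0, 0.0]])    # sigma_1
t = 0.7

# Truncated series sum_n A^n B^n t^n / n!
S, An, Bn, fact = np.zeros((2, 2)), np.eye(2), np.eye(2), 1.0
for n in range(40):
    S = S + (t ** n / fact) * (An @ Bn)
    An, Bn, fact = An @ A, Bn @ B, fact * (n + 1)

C = cosm(A @ B * t) + sinm(A @ B * t)
print(np.abs(S - C).max())                # agrees to machine precision
```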


Addendum to revised question on diagonal perturbation

For a diagonal perturbation, the perturbation series is straightforward, $$ A^n= I + n\sigma +\frac{n(n-1)}{2!}\sigma^2+... ~~~~\leadsto \\ Y= \sum_{n=0}^\infty \left(I + n\sigma +\frac{n(n-1)}{2!}\sigma^2+...\right)\frac{B^nt^n}{n!} \\ = \left (I +\sigma Bt+ \frac{\sigma^2 B^2t^2}{2!} +...\right ) e^{Bt}. $$ You may observe, however, that to sum the entire perturbation series in $\sigma$, you need, of course, to know the answer for $A$ itself, since $\sigma$ has the same commutator with $B$ that $A$ has. Shifting $A$ by the identity cannot change the structure of the problem.
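Summed to all orders, the display above asserts $Y=\bigl(\sum_k \sigma^k B^k t^k/k!\bigr)\,e^{Bt}$ exactly; note the prefactor is again a series of the same type, with $\sigma$ in place of $A$. A numerical sketch (with an arbitrarily chosen small diagonal $\sigma$ and symmetric $B$):

```python
import numpy as np
from scipy.linalg import expm

sigma = np.diag([0.05, -0.03])            # example small diagonal perturbation
B = np.array([[0.0, 1.0],
              [1.0, 0.5]])                # example symmetric B
A = np.eye(2) + sigma
t = 1.0

def series(A, B, t, terms=60):
    """Truncated sum_n A^n B^n t^n / n!."""
    S, An, Bn, fact = np.zeros((2, 2)), np.eye(2), np.eye(2), 1.0
    for n in range(terms):
        S = S + (t ** n / fact) * (An @ Bn)
        An, Bn, fact = An @ A, Bn @ B, fact * (n + 1)
    return S

lhs = series(A, B, t)                     # Σ (1+σ)^n B^n t^n / n!
rhs = series(sigma, B, t) @ expm(B * t)   # (Σ σ^k B^k t^k / k!) e^{Bt}
print(np.abs(lhs - rhs).max())            # agrees to machine precision
```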

Answer 2

Given that the expression

$$Y(\lambda)=\sum_{n=0}^\infty\frac{\lambda^n}{n!}A^nB^n$$ satisfies the differential equation

$$\frac{dY}{d\lambda}=AY(\lambda)B$$

we can use a vectorization trick to bring this equation into the more familiar form

$$\frac{d}{d\lambda} \text{vec}(Y)=(B^T\otimes A)\text{vec}(Y)$$

which can be easily solved:

$$\text{vec}(Y)=e^{\lambda (B^T\otimes A)}\text{vec}(I_{n})$$

where $I_n$ is the $n\times n$ identity matrix. Despite my best efforts, de-vectorizing this formula in a meaningful way has proven difficult, and I believe it isn't really desirable anyway, since Kronecker products offer natural solutions to matrix equations of the form $AX+XA^H=Q$, which admit an expansion similar to the one presented here. Also, obtaining a numerical solution from this formula is straightforward to implement, although it is computationally more expensive, since the dimensions of $B^T\otimes A$ are $n^2\times n^2$.
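For completeness, a numerical sketch (random matrices with the stated structure; `vec` taken column-major, i.e., column-stacking) comparing the Kronecker formula with the truncated series:

```python
import numpy as np
from scipy.linalg import expm

n = 3
rng = np.random.default_rng(0)
A = np.diag(rng.standard_normal(n))       # random diagonal A
M = rng.standard_normal((n, n))
B = (M + M.T) / 2                         # random symmetric B
lam = 0.4

# vec(Y) = exp(lam * (B^T ⊗ A)) vec(I_n), with column-stacking vec
vecY = expm(lam * np.kron(B.T, A)) @ np.eye(n).flatten(order="F")
Y_kron = vecY.reshape((n, n), order="F")

# Truncated series Σ_k A^k B^k lam^k / k!
S, Ak, Bk, fact = np.zeros((n, n)), np.eye(n), np.eye(n), 1.0
for k in range(60):
    S = S + (lam ** k / fact) * (Ak @ Bk)
    Ak, Bk, fact = Ak @ A, Bk @ B, fact * (k + 1)

print(np.abs(Y_kron - S).max())           # agrees to machine precision
```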