Trotter product formula for matrix exponentials


This question is about a problem given in Schilling's Brownian Motion.

Let $A,B \in \mathbb{R}^{d\times d}$ and set $P_t := \exp(tA) := \sum_{j=0}^\infty (tA)^j / j!$.

(Trotter Product Formula) Show that $e^{A+B} = \lim_{k\to \infty} (e^{A/k}e^{B/k})^k.$
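Before turning to the proofs, the statement is easy to check numerically. The following Python sketch (the use of scipy's `expm` and random test matrices is my choice of convenience, not part of the problem) compares $(e^{A/k}e^{B/k})^k$ with $e^{A+B}$:

```python
# Sanity check of the Trotter product formula:
# (e^{A/k} e^{B/k})^k should approach e^{A+B} as k grows.
import numpy as np
from scipy.linalg import expm  # matrix exponential

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))  # generic A, B do not commute

target = expm(A + B)
for k in [1, 10, 100, 1000]:
    trotter = np.linalg.matrix_power(expm(A / k) @ expm(B / k), k)
    print(k, np.linalg.norm(trotter - target, 2))  # error shrinks roughly like 1/k
```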

Below are two solutions given to this problem. There are parts that I don't understand from each.

In the first solution, how do we get $\log(e^{A/k}e^{B/k}) =\frac{1}{k} A + \frac{1}{k}B + \sigma_k + \sigma_k'$ with $k^2 \sigma_k'$ bounded? I don't see how to arrive at this form. Also, how can we apply the logarithm series when the input is a matrix?

In the alternative solution, I don't understand the final bound for $\Vert S_k - T_k\Vert$: how do we obtain the bound $\frac{C}{k^2}$? I would greatly appreciate some help.

[Image: the first solution]

[Image: the alternative solution]



Best answer:
  • First solution

You need to know the following fact:

Given a matrix $C$ with $\lVert C \rVert<1$ (where $\lVert \cdot \rVert$ is a matrix norm, i.e. one has $\lVert AB \rVert \leq \lVert A \rVert \lVert B \rVert$), one can define the logarithm of $\mathrm{id}+C$ by $$\log(\mathrm{id}+C) :=\sum_{n\geq 1} \frac{(-1)^{n+1}}{n}C^n=C-\frac{C^2}{2}+\dotsb.$$ The series converges and one has $\exp(\log(\mathrm{id}+C))=\mathrm{id}+C$.
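This series can be checked directly; a small numerical sketch (numpy for the partial sums, scipy's `expm` only to verify $\exp(\log(\mathrm{id}+C))=\mathrm{id}+C$; the truncation at 50 terms and the random test matrix are arbitrary choices):

```python
# The matrix logarithm via the series log(id + C) = C - C^2/2 + C^3/3 - ...,
# valid when ||C|| < 1 for a submultiplicative norm.
import numpy as np
from scipy.linalg import expm

def log_id_plus(C, terms=50):
    """Partial sum of sum_{n>=1} (-1)^{n+1} C^n / n."""
    out = np.zeros_like(C)
    P = np.eye(C.shape[0])
    for n in range(1, terms + 1):
        P = P @ C  # P = C^n
        out += ((-1) ** (n + 1) / n) * P
    return out

rng = np.random.default_rng(1)
C = 0.1 * rng.standard_normal((3, 3))  # small enough that ||C||_2 < 1
L = log_id_plus(C)
print(np.linalg.norm(expm(L) - (np.eye(3) + C), 2))  # essentially 0
```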

Now let $C:=\frac{1}{k}A+\frac{1}{k}B+\sigma_k$, where $\sigma_k$ is defined by $e^{A/k}e^{B/k}=\mathrm{id}+\frac{1}{k}A+\frac{1}{k}B+\sigma_k$; multiplying the two exponential series shows that $\sigma_k$ collects exactly the terms of order $k^{-2}$ and higher, so $k\sigma_k$ is bounded. One has

$$ \begin{align} \log(e^{A/k}e^{B/k})&=\log(\mathrm{id}+C) \\ &=C+\sum_{n\geq 2} \frac{(-1)^{n+1}}{n}C^n \\ &:=C+\sigma_k'. \end{align} $$

But one has $\sigma_k'=O(k^{-2})$. Indeed,

$$ \begin{align} k^2\lVert \sigma_k' \rVert&=k^2 \left\lVert \frac{C^2}{2}-\frac{C^3}{3}+\dotsb \right\rVert \\ &\leq k^2\left( \frac{\lVert C \rVert^2}{2}+\frac{\lVert C \rVert^3}{3}+\dotsb \right) \\ &=\lVert kC \rVert^2 \left(\frac{1}{2}+\frac{\lVert C \rVert}{3}+\dotsb \right) . \end{align} $$

When $k$ tends to infinity, $\lVert C \rVert$ tends to $0$ so

$$ \frac{1}{2}+\frac{\lVert C \rVert}{3}+\dotsb $$

is bounded. Besides, $\lVert kC \rVert = \lVert A+B+k\sigma_k \rVert$ is bounded as well. Hence $k^2\sigma_k'$ is bounded.
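One can watch this boundedness numerically; a sketch using scipy's `expm` and `logm` (the random test matrices are my choice of example):

```python
# With id + C = e^{A/k} e^{B/k}, sigma_k' = log(id + C) - C is the tail of the
# log series, and k^2 ||sigma_k'|| should stay bounded as k -> infinity.
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

vals = []
for k in [10, 100, 1000]:
    M = expm(A / k) @ expm(B / k)   # = id + C
    C = M - np.eye(3)
    sigma_kp = logm(M) - C          # tail: -C^2/2 + C^3/3 - ...
    vals.append(k**2 * np.linalg.norm(sigma_kp, 2))
    print(k, vals[-1])              # approaches ||(A+B)^2||/2, stays bounded
```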

  • Second solution

Since the series are absolutely convergent, one can rearrange the terms in any order:

$$ \begin{align} S_k-T_k &= \sum_{j=0}^{\infty} \frac{(A+B)^j}{k^j j!} - \sum_{j=0}^{\infty}\sum_{l=0}^{\infty} \frac{A^j}{k^jj!}\frac{B^l}{k^ll!} \\ &=\left(\mathrm{id}+\frac{A+B}{k}+\frac{(A+B)^2}{2k^2}+\dotsb\right)-\left(\mathrm{id}+\frac{A}{k}+\frac{B}{k}+\frac{A^2}{2k^2}+\frac{AB}{k^2}+\frac{B^2}{2k^2}+\dotsb\right) \end{align} $$

As we can see, the first few terms cancel each other out, and

$$ \begin{align} S_k-T_k &= \sum_{j=2}^{\infty} \frac{(A+B)^j}{k^j j!} - \sum_{\substack{j,l=0 \\ j+l\geq 2}}^{\infty} \frac{A^j}{k^jj!}\frac{B^l}{k^ll!}. \end{align} $$

Now both these sums can be controlled very crudely:

$$ \begin{align} k^2\left\lVert \sum_{j=2}^{\infty} \frac{(A+B)^j}{k^j j!} \right\rVert &\leq k^2\sum_{j=2}^{\infty} \frac{\lVert A+B\rVert ^j}{k^j j!} \\ &=\sum_{j=2}^{\infty} \frac{\lVert A+B\rVert ^j}{k^{j-2} j!} \\ &\leq \sum_{j=2}^{\infty} \frac{\lVert A+B\rVert ^j}{j!}\\ &\leq \exp(\lVert A+B\rVert) \end{align} $$

which is a constant, and

$$ \begin{align} k^2 \left\lVert\sum_{\substack{j,l=0 \\ j+l\geq 2}}^{\infty} \frac{A^j}{k^jj!}\frac{B^l}{k^ll!} \right\rVert&\leq \sum_{\substack{j,l=0 \\ j+l\geq 2}}^{\infty} \frac{\lVert A \rVert ^j \lVert B \rVert^l}{k^{j+l-2}j!l!} \\ &\leq \sum_{\substack{j,l=0 \\ j+l\geq 2}}^{\infty} \frac{\lVert A \rVert ^j \lVert B \rVert^l}{j!l!} \\ &\leq \sum_{j,l=0}^{\infty} \frac{\lVert A \rVert ^j \lVert B \rVert^l}{j!l!} \\ &=\exp(\lVert A \rVert)\exp(\lVert B \rVert), \end{align} $$ which is also a constant. Finally $k^2 \lVert S_k-T_k \rVert$ is bounded by the explicit constant (really not sharp) $C:=\exp(\lVert A+B\rVert)+\exp(\lVert A \rVert)\exp(\lVert B \rVert)$.
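The explicit constant can also be verified numerically; a sketch (I use the operator $2$-norm, which is submultiplicative as the estimate requires):

```python
# Check k^2 ||S_k - T_k|| <= exp(||A+B||) + exp(||A||) exp(||B||)
# with S_k = e^{(A+B)/k} and T_k = e^{A/k} e^{B/k}.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
nrm = lambda M: np.linalg.norm(M, 2)  # operator 2-norm: submultiplicative

const = np.exp(nrm(A + B)) + np.exp(nrm(A)) * np.exp(nrm(B))
for k in [1, 10, 100, 1000]:
    Sk = expm((A + B) / k)
    Tk = expm(A / k) @ expm(B / k)
    print(k, k**2 * nrm(Sk - Tk), "<=", const)
```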

Remarks

  • The use of the notation $O(\cdot)$ would greatly simplify some of the reasoning

  • Sorry for using the letter $C$ in the first part, I wrote it before seeing that was the name of the constant in the second part.

Another answer:

HINT:

First, it is not hard to show that $$\lim_{n\to \infty} \left(I+ \frac{C}{n}\right)^n = \exp C$$ for every matrix $C$; moreover, the convergence is uniform on bounded subsets of $M_{d}(\mathbb{C})$.
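A numerical illustration of this first fact (a sketch; scipy's `expm` supplies the reference value, and the test matrix is an arbitrary random choice):

```python
# (I + C/n)^n -> exp(C) for any square matrix C.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(4)
C = rng.standard_normal((3, 3))
I = np.eye(3)

for n in [1, 10, 100, 1000]:
    approx = np.linalg.matrix_power(I + C / n, n)
    print(n, np.linalg.norm(approx - expm(C), 2))  # error decreases roughly like 1/n
```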

Second, if we have a sequence of matrices $C_n$ with limit $C$, then $$\lim_{n\to \infty}\left(I+ \frac{C_n}{n}\right)^n = \exp C.$$

This is a standard two-$\epsilon$ argument. Take $\epsilon > 0$. There exists $N_{\epsilon}$ so that $\| \exp C- \exp C_n\| < \epsilon$ for all $n \ge N_{\epsilon}$. Now, since the sequence $C_n$ is also bounded, there exists $N'_{\epsilon}$ so that $\|\exp C_m - ( I+ \frac{C_m}{n})^n\|< \epsilon$ for all $n \ge N'_{\epsilon}$ and all $m$ ...

Third, one sees that we have $$\exp \frac{A}{n} \cdot \exp \frac{B}{n} = I + \frac{1}{n}( A + B + \delta_n),$$ where $\delta_n \to 0$ (in fact $\| \delta_n\| = O(\frac{1}{n})$).
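This last expansion is easy to confirm numerically as well; a sketch (random test matrices assumed, and the $O(1/n)$ rate is visible in the output):

```python
# delta_n := n (e^{A/n} e^{B/n} - I) - (A + B) should tend to 0, in fact like 1/n.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(5)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

for n in [10, 100, 1000]:
    delta_n = n * (expm(A / n) @ expm(B / n) - np.eye(3)) - (A + B)
    print(n, np.linalg.norm(delta_n, 2))  # shrinks roughly like 1/n
```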