Operator Exponential $e^A e^B = e^{A+B}$


This question comes from an exam in my functional analysis class.

Suppose $X$ is a Banach space, and $T \in B(X,X)$ is a bounded linear operator on $X$. For any non-negative integer $n$, let $$S_n=\sum_{k=0}^n \frac{1}{k!} T^k$$ where $T^k$ is the composition of $T$ with itself $k$ times and $T^0=I$.

We can show that for any integer $k>0$, $\Vert T^k \Vert \le \Vert T \Vert ^k$. Then we can show that $S_n \in B(X,X)$ and there is some $S \in B(X,X)$ such that $S_n \to S$. We write $S = e^T$ for this operator.
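As a concrete sanity check (not part of the exam problem), here is a sketch of the partial sums $S_n$ for a specific $2\times 2$ matrix, the rotation generator, whose exponential is the rotation matrix; the helper `partial_sum` and the choice of $T$ are illustrative assumptions.

```python
import numpy as np

def partial_sum(T, n):
    """S_n = sum_{k=0}^n T^k / k!  (with T^0 = I)."""
    S = np.zeros_like(T, dtype=float)
    term = np.eye(T.shape[0])          # current term T^k / k!, starting at I
    for k in range(n + 1):
        S = S + term
        term = term @ T / (k + 1)      # next term: T^{k+1} / (k+1)!
    return S

T = np.array([[0.0, 1.0], [-1.0, 0.0]])   # rotation generator
S10 = partial_sum(T, 10)
S20 = partial_sum(T, 20)
# The partial sums are Cauchy in operator norm, so the tail is tiny:
print(np.linalg.norm(S20 - S10, 2))
```

For this $T$, $e^T$ is the rotation by one radian, so $S_{20}$ should agree with $\begin{pmatrix}\cos 1 & \sin 1\\ -\sin 1 & \cos 1\end{pmatrix}$ to high accuracy.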

Finally, we were asked to show that $e^T$ has an inverse, and that it is $e^{-T}$. My thought is to prove the following claim first: if $A,B \in B(X,X)$ and $AB = BA$, then $e^A e^B = e^{A+B}$. If the claim is true, it follows that $e^T e^{-T}=I$.

The claim can be proven provided that the product series can be computed with the Cauchy rule.

$$e^Ae^B=\sum_{i=0}^{\infty}\frac{A^i}{i!}\sum_{j=0}^{\infty}\frac{B^j}{j!}=\sum_{k=0}^{\infty}\sum_{l=0}^{k}\frac{A^lB^{k-l}}{l!(k-l)!}$$ $$=\sum_{k=0}^{\infty}\frac{1}{k!}\sum_{l=0}^{k}\frac{k!}{l!(k-l)!}A^lB^{k-l}= \sum_{k=0}^{\infty}\frac{1}{k!}(A+B)^k= e^{A+B}$$

But why can the product series be summed in the Cauchy way? I know that for real-number series, if $\sum_{n=1}^{\infty} a_n$ and $\sum_{n=1}^{\infty} b_n$ are absolutely convergent with sums $A$ and $B$, respectively, then the products $a_i b_j$ can be summed in any order, and the resulting series converges to $AB$. Does this proposition still hold for commuting operators? (A proof idea or a reference would be greatly appreciated.)
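A quick numerical sketch of the claim (with made-up matrices and an illustrative truncated-series helper `expm_series`, not a library function): for commuting $A,B$ the identity $e^A e^B = e^{A+B}$ holds, while for a non-commuting pair it generally fails.

```python
import numpy as np

def expm_series(M, n=40):
    """Truncated exponential series sum_{k=0}^n M^k / k! (illustrative)."""
    S = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(n):
        term = term @ M / (k + 1)
        S = S + term
    return S

A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = 2.0 * A                                # commutes with A
err_commute = np.linalg.norm(
    expm_series(A) @ expm_series(B) - expm_series(A + B))

C = np.array([[0.0, 0.0], [1.0, 0.0]])     # does NOT commute with A
err_noncommute = np.linalg.norm(
    expm_series(A) @ expm_series(C) - expm_series(A + C))

print(err_commute, err_noncommute)
```

The first error is at machine-precision level, while the second is of order one, matching the role commutativity plays in the binomial step of the computation above.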


2 Answers

BEST ANSWER

In any Banach algebra, the Cauchy product of two absolutely convergent series is absolutely convergent, and with the expected sum. That is, if $\sum_{j=0}^\infty \|a_j\|<\infty$ and $\sum_{k=0}^\infty \|b_k\|<\infty$, then defining $c_m = \sum_{j+k=m}a_jb_k$, we get an absolutely convergent series and $$ \sum_{m=0}^\infty c_m = \left(\sum_{j=0}^\infty a_j \right)\left(\sum_{k=0}^\infty b_k\right) \tag1 $$ The proof is literally the same as the proof for real/complex numbers. One doesn't even need $a_j$ and $b_k$ to commute to have (1), because they are always multiplied in the same order. But here is a proof anyway.

Proof: Let $A_n$, $B_n$, $C_n$ denote partial sums over indices $0,\dots,n$. Consider the difference $A_nB_n-C_n$. It consists of all terms $a_j b_k$ where $ j,k\le n$ and $j+k>n$. By the triangle inequality, it suffices to prove that $$ \sum_{j,k\le n, \ j+k>n} \|a_j\| \,\|b_k\| \tag2 $$ is small when $n$ is large. Since either $j$ or $k$ has to be $>n/2$, we can estimate (2) from above by $$ \sum_{j\le n,\ n/2<k\le n} \|a_j\| \,\|b_k\| + \sum_{n/2<j\le n,\ k\le n} \|a_j\| \,\|b_k\| \tag 3$$ which can be rewritten as $$ \sum_{j\le n} \|a_j\| \sum_{n/2<k\le n} \|b_k\| + \sum_{k\le n}\|b_k\| \sum_{n/2<j\le n} \|a_j\| \tag 4$$ As $n\to \infty$, the first factor in each product stays bounded while the second factor goes to zero.
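The scalar case of this Cauchy-product theorem can be illustrated numerically; the example below (with illustrative choices $a_j = x^j/j!$ and $b_k = y^k/k!$) sums the product series $c_m = \sum_{j+k=m} a_j b_k$ and recovers $e^x \cdot e^y = e^{x+y}$.

```python
import math

def cauchy_product_sum(a, b, N):
    """Sum the first N+1 Cauchy-product terms c_m = sum_{j+k=m} a_j * b_k."""
    c = [sum(a[j] * b[m - j] for j in range(m + 1)) for m in range(N + 1)]
    return sum(c)

x, y = 0.7, -0.3
N = 30
a = [x**j / math.factorial(j) for j in range(N + 1)]   # terms of e^x
b = [y**k / math.factorial(k) for k in range(N + 1)]   # terms of e^y
total = cauchy_product_sum(a, b, N)
print(total)   # should be close to e^{x+y}
```

Here each $c_m$ collapses to $(x+y)^m/m!$ by the binomial theorem, which is exactly the rearrangement used in the question.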


Technically, the post itself only asks about showing that the inverse of $e^{S}$ is $e^{-S}$. First, you can check that the power series $$e^{tS}=\sum\limits_{n=0}^\infty\frac{t^n}{n!}S^n$$ makes sense for all $t\in\mathbb{R}, S\in\mathcal{L}(E).$ Differentiating in $t$ gives $$\frac{d}{dt}e^{tS}=Se^{tS}=e^{tS}S.$$ Next, differentiate $e^{(s+t)S}e^{-tS}$ in $t$. You'll get that it's zero, so it's constant in $t$, and evaluation at $t=0$ gives $$e^{(s+t)S}e^{-tS}=e^{sS}.$$ Evaluate this at $s=0$ to get that $e^{tS}e^{-tS}=I$, which provides the desired inversion property (take $t=1$).
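A numerical sketch of this inversion property (with a made-up matrix $S$ and an illustrative truncated-series helper `expm_series`): the product $e^{tS}e^{-tS}$ stays equal to $I$ for several values of $t$.

```python
import numpy as np

def expm_series(M, n=40):
    """Truncated exponential series sum_{k=0}^n M^k / k! (illustrative)."""
    S = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(n):
        term = term @ M / (k + 1)
        S = S + term
    return S

S = np.array([[0.3, -1.2], [0.5, 0.1]])    # arbitrary example operator
max_err = max(
    np.linalg.norm(expm_series(t * S) @ expm_series(-t * S) - np.eye(2))
    for t in (0.5, 1.0, 2.0))
print(max_err)
```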

Next, we'll answer the question in your title. Compute that $$\frac{d}{dt}(e^{t(S+T)}e^{-tT}e^{-tS})=e^{t(S+T)}Se^{-tT}e^{-tS}-e^{t(S+T)}e^{-tT}Se^{-tS}.$$ We claim this equals zero. Indeed, the fact that $ST=TS$ implies that $$e^{-tT}S=\sum\limits_{n=0}^\infty \frac{(-t)^n}{n!}T^nS=S\sum\limits_{n=0}^\infty \frac{(-t)^n}{n!}T^n=Se^{-tT},$$ so the two terms in the derivative cancel.

Hence, $$e^{t(S+T)}e^{-tT}e^{-tS}$$ is constant in $t$. Evaluating at $t=0$ gives that $$e^{t(S+T)}e^{-tT}e^{-tS}=I.$$ Next, multiply on the right by $e^{tS}$, then by $e^{tT}$, and use the inversion property that we established earlier. This shows that $$e^{t(S+T)}=e^{tS}e^{tT}.$$ Evaluating at $t=1$ gives the result that you wanted.

This latter argument can be done directly using the holomorphic functional calculus, as well.