This question comes from an exam in my functional analysis class.
Suppose $X$ is a Banach space, and $T \in B(X,X)$ is a bounded linear operator on $X$. For any non-negative integer $n$, let $$S_n=\sum_{k=0}^n \frac{1}{k!} T^k$$ where $T^k$ is the composition of $T$ with itself $k$ times and $T^0=I$.
We can show that $\Vert T^k \Vert \le \Vert T \Vert ^k$ for every integer $k>0$. It follows that $S_n \in B(X,X)$ and that there is some $S \in B(X,X)$ such that $S_n \to S$ in operator norm (the partial sums form a Cauchy sequence, and $B(X,X)$ is complete). We write $S = e^T$ for this operator.
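As a sanity check in the finite-dimensional case (where $B(X,X)$ is just a space of matrices), the partial sums $S_n$ can be computed directly and compared against SciPy's matrix exponential. The matrix `T` below is an arbitrary example, not from the question:

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary 2x2 matrix standing in for T (finite-dimensional X).
T = np.array([[0.0, 1.0], [-1.0, 0.5]])

def partial_sum(T, n):
    """S_n = sum_{k=0}^n T^k / k!  (with T^0 = I)."""
    S = np.zeros_like(T)
    term = np.eye(T.shape[0])  # current term T^k / k!, starting at k = 0
    for k in range(n + 1):
        S = S + term
        term = term @ T / (k + 1)  # next term T^{k+1} / (k+1)!
    return S

# The partial sums converge rapidly to the matrix exponential.
print(np.linalg.norm(partial_sum(T, 30) - expm(T)))
```

The factorial in the denominator dominates $\Vert T\Vert^k$, which is exactly why the series converges absolutely for every bounded $T$.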
Finally, we were asked to show that $e^T$ is invertible and that its inverse is $e^{-T}$. My thought is to prove the following claim first: if $A,B \in B(X,X)$ and $AB = BA$, then $e^A e^B = e^{A+B}$. If the claim is true, then since $T$ and $-T$ commute, $e^T e^{-T} = e^{-T} e^T = e^0 = I$.
The claim can be proven provided that the product series can be computed with the Cauchy rule.
$$e^Ae^B=\sum_{i=0}^{\infty}\frac{A^i}{i!}\sum_{j=0}^{\infty}\frac{B^j}{j!}=\sum_{k=0}^{\infty}\sum_{l=0}^{k}\frac{A^lB^{k-l}}{l!(k-l)!}$$ $$=\sum_{k=0}^{\infty}\frac{1}{k!}\sum_{l=0}^{k}\frac{k!}{l!(k-l)!}A^lB^{k-l}= \sum_{k=0}^{\infty}\frac{1}{k!}(A+B)^k= e^{A+B}$$
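For a concrete finite-dimensional illustration (the matrices are arbitrary choices of mine; `B` is taken to be a polynomial in `A` so that the two commute), one can check numerically that the identity holds for a commuting pair and fails for a non-commuting one:

```python
import numpy as np
from scipy.linalg import expm

# Commuting pair: B is a polynomial in A, so AB = BA.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
B = 2.0 * A + 3.0 * np.eye(2)
assert np.allclose(A @ B, B @ A)

# e^A e^B = e^{A+B} holds for the commuting pair...
print(np.allclose(expm(A) @ expm(B), expm(A + B)))  # True

# ...but fails in general: C and D below do not commute.
C = np.array([[0.0, 1.0], [0.0, 0.0]])
D = np.array([[0.0, 0.0], [1.0, 0.0]])
print(np.allclose(expm(C) @ expm(D), expm(C + D)))  # False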
But why can the product series be summed in the Cauchy way? I know that for real-number series, by Cauchy's theorem, if $\sum_{n=1}^{\infty} a_n$ and $\sum_{n=1}^{\infty} b_n$ are absolutely convergent with sums $A$ and $B$, respectively, then the products $a_i b_j$ can be summed in any order, and the resulting series converges to $AB$. Does this proposition still hold for commuting operators? (Ideas for a proof or a reference would be greatly appreciated.)
In any Banach algebra, the Cauchy product of two absolutely convergent series is absolutely convergent, with the expected sum. That is, if $\sum_{j=0}^\infty \|a_j\|<\infty$ and $\sum_{k=0}^\infty \|b_k\|<\infty$, then defining $c_m = \sum_{j+k=m}a_jb_k$, we get an absolutely convergent series and $$ \sum_{m=0}^\infty c_m = \left(\sum_{j=0}^\infty a_j \right)\left(\sum_{k=0}^\infty b_k\right) \tag1 $$ The proof is literally the same as the proof for real/complex numbers. One doesn't even need $a_j$ and $b_k$ to commute for (1) to hold, because they are always multiplied in the same order. But here is a proof anyway.
Proof: Let $A_n$, $B_n$, $C_n$ denote the partial sums over indices $0,\dots,n$. Consider the difference $A_nB_n-C_n$. It consists of all terms $a_j b_k$ where $ j,k\le n$ and $j+k>n$. By the triangle inequality, it suffices to prove that $$ \sum_{j,k\le n, \ j+k>n} \|a_j\| \,\|b_k\| \tag2 $$ is small when $n$ is large. Since either $j$ or $k$ has to be $>n/2$, we can estimate (2) from above by $$ \sum_{j\le n,\ n/2<k\le n} \|a_j\| \,\|b_k\| + \sum_{n/2<j\le n,\ k\le n} \|a_j\| \,\|b_k\| \tag 3$$ which can be rewritten as $$ \sum_{j\le n} \|a_j\| \sum_{n/2<k\le n} \|b_k\| + \sum_{k\le n}\|b_k\| \sum_{n/2<j\le n} \|a_j\| \tag 4$$ As $n\to \infty$, the first factor in each product stays bounded while the second factor goes to zero.
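To illustrate the point that commutativity is not needed for (1), here is a numerical sketch (finite-dimensional, with two arbitrarily chosen non-commuting matrices): the Cauchy-product partial sums of the two exponential series converge to $e^Ae^B$, even though $e^Ae^B \ne e^{A+B}$ for this pair.

```python
import numpy as np
from scipy.linalg import expm

# Two non-commuting matrices: (1) still holds, even though
# e^A e^B != e^{A+B} here.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])

N = 30
I = np.eye(2)
a = [I]  # a_j = A^j / j!
b = [I]  # b_k = B^k / k!
for j in range(1, N + 1):
    a.append(a[-1] @ A / j)
    b.append(b[-1] @ B / j)

# C_N = sum_{m=0}^N c_m, where c_m = sum_{j+k=m} a_j b_k
C = sum(sum(a[j] @ b[m - j] for j in range(m + 1)) for m in range(N + 1))

print(np.allclose(C, expm(A) @ expm(B)))  # True
```

Note that each product `a[j] @ b[m - j]` keeps the factors in the same order, which is exactly why no commutativity assumption enters the computation.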