Prove that $e^{t(X+Y)}=e^{tX} e^{tY}$ implies $[X,Y]=0$


I am currently reading about the Baker-Campbell-Hausdorff formula. In a textbook on Lie algebras, the author shows that if

$$[X,[X,Y]] = 0 \quad \text{ and } [Y,[X,Y]] = 0$$

then

$$e^{Xt}e^{Yt} = e^{Xt + Yt + \frac{t^{2}}{2}[X,Y]}.$$

where $[X,Y] = XY-YX$ and $X,Y$ are square matrices. I later read on wikipedia that if

$$e^{X}e^{Y} = e^{(X+Y)},$$

this does not necessarily imply that $X$ and $Y$ commute, which leads me to believe that the converse of the result in the text is false.
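For context, the hypothesis of the textbook identity is easy to satisfy with strictly upper-triangular (Heisenberg-type) matrices, where $[X,Y]$ commutes with both $X$ and $Y$. Here is a numerical sanity check of the identity (a sketch assuming numpy/scipy; the specific matrices are my own choice, not from the textbook):

```python
import numpy as np
from scipy.linalg import expm

# Heisenberg-type matrices: [X, Y] = E_13 commutes with both X and Y,
# so the textbook identity should hold for every t.
X = np.array([[0., 1., 0.], [0., 0., 0.], [0., 0., 0.]])
Y = np.array([[0., 0., 0.], [0., 0., 1.], [0., 0., 0.]])
C = X @ Y - Y @ X  # the commutator [X, Y]

t = 0.7
lhs = expm(t*X) @ expm(t*Y)
rhs = expm(t*X + t*Y + (t**2/2)*C)
print(np.allclose(lhs, rhs))  # True
```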

How do you prove/disprove the converse? It seems difficult and laborious to construct and verify a counterexample (and I'm not even sure the converse is false); I tried to differentiate $e^{Xt}e^{Yt} = e^{Xt+Yt + \frac{t^{2}}{2}[X,Y]}$ and evaluate at zero, but it quickly turned into a mess due to the complexity of the higher order derivatives. I then tried to prove the much easier

$$e^{Xt}e^{Yt} = e^{(X+Y)t} \implies [X,Y] = 0$$

with the same technique (I hypothesized that if it can work with this statement, then it will work with the harder one after some effort), but I ended up with a collection of matrix products after the third derivative evaluated at zero that did not seem to help me at all.

Edit:

After some additional reading, I have found that $e^{X}e^{Y} = e^{X+Y} \implies [X,Y]=0$ is not true, but if $e^{Xt}e^{Yt} = e^{(X+Y)t}$ for all $t$, then the stronger conclusion does hold. This makes me suspect the converse I want to prove/disprove is true, but it has put me no closer to actually proving it.

Edit 2:

After taking several higher-order derivatives and evaluating at zero, I have been able to show that $[Y,[X,Y]] = 0$, but repeating this process is becoming too difficult to compute by hand due to the large number of nested commutators. I think there should be a way to use symmetry to conclude from this result that $X$ must also commute with $[X,Y]$, but I don't know how to proceed.

Best answer:

Take $ A=\begin{pmatrix} 0&0 \\ 0&2i\pi \\ \end{pmatrix}$, $\quad B=\begin{pmatrix} 0&1\\ 0 & 2i\pi \end{pmatrix}.$ Then you can show that $\exp(A)=\exp(B)=\exp(A+B)=I_2$, and $AB \neq BA$.
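Assuming numpy/scipy are available, this counterexample can be verified numerically (a sketch; the variable names are mine):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0, 0], [0, 2j*np.pi]])
B = np.array([[0, 1], [0, 2j*np.pi]])
I2 = np.eye(2)

# All three exponentials collapse to the identity ...
print(np.allclose(expm(A), I2))      # True
print(np.allclose(expm(B), I2))      # True
print(np.allclose(expm(A + B), I2))  # True
# ... yet A and B do not commute.
print(np.allclose(A @ B, B @ A))     # False
```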

For the proof of $e^{Xt+Yt}=e^{Xt} e^{Yt}$ for all $t$ $\implies [X,Y]=0$: note that $$\exp\Big(\frac{Xt}{n}\Big)\exp\Big(\frac{Yt}{n}\Big)\exp\Big(-\frac{Xt+Yt}{n}\Big) = {\rm Id} + \frac{[Xt,Yt]}{2n^2} + o(n^{-2}),$$ hence $$\lim_{n \to +\infty} \left( \exp\Big(\frac{Xt}{n}\Big)\exp\Big(\frac{Yt}{n}\Big)\exp\Big(-\frac{Xt+Yt}{n}\Big) \right)^{2n^2} = \exp([Xt,Yt]).$$ By the hypothesis applied at $t/n$, every term of the sequence equals ${\rm Id}$, so $\exp([Xt,Yt])={\rm Id}$, i.e. $\exp(t^2[X,Y])={\rm Id}$ for all $t$. Writing $s=t^2$, this gives $\exp(s[X,Y])={\rm Id}$ for all $s\geq 0$; differentiating at $s=0$ yields $[X,Y]=0$.
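This Trotter-style limit can also be observed numerically. Below is a sketch (assuming numpy/scipy; the matrices and the choice $n=500$ are mine) with $t=1$ and a pair of matrices whose commutator is nonzero:

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0., 1.], [0., 0.]])
Y = np.array([[0., 0.], [1., 0.]])
C = X @ Y - Y @ X  # [X, Y] = diag(1, -1), nonzero

n = 500
# One factor of the sequence: Id + [X,Y]/(2n^2) + o(n^{-2})
M = expm(X/n) @ expm(Y/n) @ expm(-(X + Y)/n)
approx = np.linalg.matrix_power(M, 2*n*n)
target = expm(C)
print(np.linalg.norm(approx - target))  # small for large n
```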

Another answer:

The Baker-Campbell-Hausdorff formula expresses

$$\log(\exp(X)\exp(Y))$$

as a power series in $X$, $Y$, and their iterated commutators (see, e.g., Wikipedia). Replacing $X$ by $tX$ and $Y$ by $tY$ and equating powers of $t$ (the $t^3$ coefficient of the series is $\frac{1}{12}([X,[X,Y]] + [Y,[Y,X]])$), one obtains:

$$\exp(tX)\exp(tY)=\exp\Big(tX+tY+\frac{t^2}{2}[X,Y]\Big) \ \text{ for all } t\in \mathbb{R} \implies [X,[X,Y]] + [Y,[Y,X]] = 0.$$

Of course the formula also gives the previous result

$$\exp(tX)\exp(tY)=\exp(tX+tY) \ \text{ for all } t\in \mathbb{R} \implies [X,Y] = 0.$$
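The $t^3$ coefficient quoted above can be checked numerically: subtracting the terms through $t^2$ from $\log(e^{tX}e^{tY})$ should leave $\frac{t^3}{12}([X,[X,Y]] + [Y,[Y,X]])$ up to $O(t^4)$. A sketch assuming numpy/scipy (the matrices and the value of $t$ are my own choices):

```python
import numpy as np
from scipy.linalg import expm, logm

X = np.array([[0., 1., 0.], [0., 0., 1.], [0., 0., 0.]])
Y = X.T
C = X @ Y - Y @ X  # [X, Y]

t = 0.02
# log(e^{tX} e^{tY}) minus the terms through t^2 ...
R = logm(expm(t*X) @ expm(t*Y)) - (t*(X + Y) + (t**2/2)*C)
# ... should match the t^3 BCH term up to O(t^4).
YX_comm = -C  # [Y, X]
pred = (t**3/12) * ((X @ C - C @ X) + (Y @ YX_comm - YX_comm @ Y))
print(np.linalg.norm(R - pred))  # of order t^4, i.e. tiny
```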