It is a well-known fact that if $A,B\in M_{n\times n}(\mathbb C)$ and $AB=BA$, then $e^Ae^B=e^Be^A.$
The converse does not hold. Horn and Johnson give the following example in their Topics in Matrix Analysis (page 435). Let $$A=\begin{pmatrix}0&0\\0&2\pi i\end{pmatrix},\qquad B=\begin{pmatrix}0&1\\0&2\pi i\end{pmatrix}.$$ Then $$AB=\begin{pmatrix}0&0\\0&-4\pi^2\end{pmatrix}\neq\begin{pmatrix}0&2\pi i\\0&-4\pi^2\end{pmatrix}=BA.$$ We have $$e^A=\sum_{k=0}^{\infty}\frac 1{k!}\begin{pmatrix}0&0\\0&2\pi i\end{pmatrix}^k=\sum_{k=0}^{\infty}\frac 1{k!}\begin{pmatrix}0^k&0\\0&(2\pi i)^k\end{pmatrix}=\begin{pmatrix}e^0&0\\0&e^{2\pi i}\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix}.$$
For $S=\begin{pmatrix}1&-\frac i{2\pi}\\0&1 \end{pmatrix},$ we have $$e^B=e^{SAS^{-1}}=Se^AS^{-1}=S\begin{pmatrix}1&0\\0&1\end{pmatrix}S^{-1}=\begin{pmatrix}1&0\\0&1\end{pmatrix}.$$
Therefore, $A$ and $B$ are non-commuting matrices satisfying $e^Ae^B=\begin{pmatrix}1&0\\0&1\end{pmatrix}=e^Be^A.$
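The counterexample is easy to check numerically. The following sketch approximates the matrix exponential by a truncated power series for $2\times 2$ complex matrices (plain Python lists, no external libraries):

```python
# Sanity check of the Horn-Johnson example: A and B do not commute,
# yet exp(A) = exp(B) = I, so the exponentials trivially commute.
import cmath

def mat_mul(X, Y):
    """Product of two 2x2 complex matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_exp(X, terms=60):
    """Truncated power series exp(X) = sum_{k} X^k / k! for a 2x2 matrix."""
    result = [[1, 0], [0, 1]]          # X^0 / 0! = identity
    power = [[1, 0], [0, 1]]
    fact = 1.0
    for k in range(1, terms):
        power = mat_mul(power, X)      # X^k
        fact *= k                      # k!
        result = [[result[i][j] + power[i][j] / fact for j in range(2)]
                  for i in range(2)]
    return result

def close(X, Y, tol=1e-9):
    return all(abs(X[i][j] - Y[i][j]) < tol for i in range(2) for j in range(2))

w = 2j * cmath.pi
A = [[0, 0], [0, w]]
B = [[0, 1], [0, w]]
I = [[1, 0], [0, 1]]

print(close(mat_mul(A, B), mat_mul(B, A)))          # False: AB != BA
print(close(mat_exp(A), I), close(mat_exp(B), I))   # True True: e^A = e^B = I
```

With 60 terms the truncation error of the series is far below the tolerance, since the entries of $A$ and $B$ have modulus at most $2\pi$.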
It is clear that $\pi$ plays an essential role in this particular example. In fact, the authors add the following remark.
It is known that if all entries of $A,B\in M_n$ are algebraic numbers and $n\geq 2,$ then $e^A\cdot e^B=e^B\cdot e^A$ if and only if $AB=BA.$
No proof is given. How does one go about proving that?
The proof comes from Wermuth's paper "Two remarks on matrix exponential" (Linear Algebra Appl., 1989; DOI 10.1016/0024-3795(89)90554-5).
It seems to me that it is enough to assume that no two distinct eigenvalues of $A$ (and likewise of $B$) differ by a nonzero integer multiple of $2\pi i$. This holds, for instance, whenever $\pi$ is transcendental over the field generated by the entries, though that is not the only possibility. In particular, it holds when all entries of $A$ and $B$ are algebraic: the eigenvalues are then algebraic numbers, while a nonzero integer multiple of $2\pi i$ is transcendental, so no such difference can occur.
The basic idea is to somehow reverse the exponential function, and express $A$ and $B$ as power series (in fact, polynomials) in their exponentials.
Let $m(\lambda)=\prod_j (\lambda-\lambda_j)^{\mu_j}$ be the minimal polynomial of $A$. By assumption the values $e^{\lambda_j}$ are pairwise distinct, so by Hermite's interpolation theorem we can find a polynomial $f$ such that for $g=f\circ \exp$ we have $g(\lambda_j)=\lambda_j$, and if $\mu_j>1$, then $g'(\lambda_j)=1$ and $g^{(l)}(\lambda_j)=0$ for $2\leq l\leq \mu_j-1$. (The conditions are prescribed for $f$ and its derivatives at the distinct nodes $e^{\lambda_j}$, and translate into conditions on $g$ via the chain rule.)
Then we have $g(A)=A$. This follows from the Jordan normal form: on a Jordan block $J=\lambda_j I+N$ of size at most $\mu_j$ (with $N$ nilpotent) we have $g(J)=\sum_{l<\mu_j}\frac{g^{(l)}(\lambda_j)}{l!}N^l=\lambda_j I+N=J$ by the interpolation conditions, so $g$ acts as the identity on every block. But $g(A)=f(e^A)$, so $A=f(e^A)$, and similarly $B=h(e^B)$ for some polynomial $h$. Since $e^A$ and $e^B$ commute by hypothesis, so do polynomials in them, and therefore $AB=f(e^A)h(e^B)=h(e^B)f(e^A)=BA$.
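The reconstruction $A=f(e^A)$ can be illustrated concretely in the easiest case of distinct eigenvalues, where plain Lagrange (here degree-1) interpolation suffices. The matrix $A=\begin{pmatrix}1&2\\0&3\end{pmatrix}$ below is my own toy example, not taken from Wermuth's paper:

```python
# Sketch of A = f(e^A) for a 2x2 matrix A with distinct real eigenvalues
# 1 and 3 (their difference is not a multiple of 2*pi*i, so e^1 != e^3).
# f is the degree-1 polynomial with f(e^1) = 1 and f(e^3) = 3.
import math

def mat_mul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_exp(X, terms=40):
    """Truncated power series exp(X) = sum_{k} X^k / k! for a 2x2 matrix."""
    result = [[1.0, 0.0], [0.0, 1.0]]
    power = [[1.0, 0.0], [0.0, 1.0]]
    fact = 1.0
    for k in range(1, terms):
        power = mat_mul(power, X)
        fact *= k
        result = [[result[i][j] + power[i][j] / fact for j in range(2)]
                  for i in range(2)]
    return result

A = [[1.0, 2.0], [0.0, 3.0]]   # eigenvalues 1 and 3
l1, l2 = 1.0, 3.0
E = mat_exp(A)                 # E = e^A

# Lagrange interpolation through (e^l1, l1) and (e^l2, l2): f(x) = c0 + c1*x.
c1 = (l2 - l1) / (math.exp(l2) - math.exp(l1))
c0 = l1 - c1 * math.exp(l1)

# Evaluate f at the matrix e^A:  f(E) = c0*I + c1*E.
fE = [[c0 * (1.0 if i == j else 0.0) + c1 * E[i][j] for j in range(2)]
      for i in range(2)]

print(all(abs(fE[i][j] - A[i][j]) < 1e-6 for i in range(2) for j in range(2)))  # True
```

Since $A$ is diagonalizable with $f(e^{\lambda_j})=\lambda_j$, the matrix $f(e^A)$ has the same eigenvectors as $A$ with the eigenvalues restored, so it equals $A$; in the general (non-diagonalizable) case the derivative conditions from Hermite interpolation take over.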