Background
I am working through an assignment on the fundamentals of quantum computing, which consists mostly of linear-algebra questions. All matrices and vectors are over the complex numbers.
Question
Define the exponential of a matrix as $$e^{\lambda A} = \sum_{i = 0}^{\infty}\frac{\lambda^i}{i!}A^i$$ where $A^0 = \mathbb{I}$, the identity operator.
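As an aside, this series definition is easy to sanity-check numerically; a minimal sketch (not part of the assignment, assuming `numpy`/`scipy` are available, with an arbitrary test matrix) that truncates the series and compares it against `scipy.linalg.expm`:

```python
import numpy as np
from scipy.linalg import expm

def exp_series(A, lam=1.0, terms=30):
    """Approximate e^{lam * A} by truncating the defining power series."""
    result = np.eye(A.shape[0], dtype=complex)  # A^0 = I
    term = np.eye(A.shape[0], dtype=complex)
    for i in range(1, terms):
        term = term @ (lam * A) / i        # accumulates lam^i A^i / i!
        result = result + term
    return result

A = np.array([[0, 1], [-1, 0]], dtype=complex)
print(np.allclose(exp_series(A), expm(A)))  # True
```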
- Given a matrix $B$, a matrix logarithm of $B$ is defined as a matrix $A$ such that $e^A = B$. Prove that $\log$ is not unique (i.e. for every $B$, there are several matrices $A$ satisfying $e^A = B$).
- Prove that for two matrices $A$ and $B$, if $AB = BA$, then $\log(AB) = \log(A) + \log(B)$.
My approach
- For this part, I am firstly not sure about the existence of an $A$: it is not obvious to me why such an $A$ should always exist, or how to compute one. Nonetheless, assuming we have a solution $A$ with $\log(B) = A$, I try to derive a family of solutions from it (this is my intuition): $$\therefore B = \mathbb{I} + A + \frac{A^2}{2!} + \frac{A^3}{3!} + \ldots$$ Let $A' = A + 2\pi i \mathbb{I}$, so that $$ e^{A'} = \mathbb{I} + (A + 2\pi i\mathbb{I}) + \frac{({A + 2\pi i\mathbb{I}})^2}{2!} + \frac{({A + 2\pi i\mathbb{I}})^3}{3!} + \ldots $$ Since $A$ and $\mathbb{I}$ always commute, we can factor the sum as (the proof is just algebraic manipulation and an application of the binomial theorem, so I've skipped it) $$e^{A'} = \left(\mathbb{I} + A + \frac{A^2}{2!} + \frac{A^3}{3!} + \ldots\right)\left(\mathbb{I} + \frac{2\pi i}{1!}\mathbb{I} + \frac{(2\pi i)^2}{2!}\mathbb{I} + \frac{(2 \pi i)^3}{3!} \mathbb{I} + \ldots \right)$$ $$\therefore e^{A'} = e^A \mathbb{I} \left(1 + \frac{2\pi i}{1!}+ \frac{(2\pi i)^2}{2!} + \frac{(2 \pi i)^3}{3!} + \ldots \right)$$ $$\therefore e^{A'} = e^A \mathbb{I}\, e^{2 \pi i} = B$$ Similarly, $\exp{(A + 2n \pi i \mathbb{I})} = B$ for all $n \in \mathbb{Z}$, which finishes the proof.
- Assume $e^X = AB$, $e^Y = A$, and $e^Z = B$. We want to show that $X = Y + Z$. We write out the expansions of $A$ and $B$ and then use the fact that $AB = BA$: $$AB = \mathbb{I} + (Y + Z) + \frac{Y^2 + 2YZ + Z^2}{2!} + \frac{Y^3 + 3Y^2Z + 3YZ^2 + Z^3}{3!} + \ldots$$ $$BA = \mathbb{I} + (Y + Z) + \frac{Y^2 + 2ZY + Z^2}{2!} + \frac{Y^3 + 3ZY^2 + 3Z^2Y + Z^3}{3!} + \ldots$$ Setting $AB - BA = \mathbb{O}$ gives $$\therefore (YZ - ZY) + \frac{1}{2!} (Y^2Z - ZY^2) + \frac{1}{2!}(YZ^2 - Z^2Y) + \ldots = \mathbb{O}$$
This is where I am stuck. Obviously, if $Y$ and $Z$ commute, then the above expression equals $\mathbb{O}$, and then from the expansions of $AB$ and $BA$ we get $AB = BA = e^{Y+Z} = e^X$, and our proof is complete.
The question is: how do I prove that this is the only case in which the summation vanishes? If $Y$ and $Z$ don't commute, is it still possible to proceed with the proof? Are there any alternate approaches?
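(For what it's worth, the family of logarithms from the first part is easy to verify numerically; a quick sketch, assuming `scipy` is available, with an arbitrarily chosen $A$:)

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary matrix A for illustration; B = e^A.
A = np.array([[1, 2], [0, 3]], dtype=complex)
B = expm(A)

# Every A + 2*pi*i*n*I should also be a logarithm of B.
for n in (-2, -1, 1, 5):
    A_prime = A + 2j * np.pi * n * np.eye(2)
    print(np.allclose(expm(A_prime), B))  # True for each n
```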
As a comment on your question alludes to, you can only take the logarithm of an invertible matrix, because $e^{A}$ is necessarily invertible: $e^{A}e^{-A}=I$.
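(A quick numerical illustration of this, assuming `scipy` is available, with a random test matrix:)

```python
import numpy as np
from scipy.linalg import expm

# e^A e^{-A} = I for any square A, so e^A is always invertible.
A = np.random.default_rng(0).standard_normal((3, 3))
print(np.allclose(expm(A) @ expm(-A), np.eye(3)))  # True
```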
As for the second part of your question, as you've said, it's just about showing that $e^C e^D=e^{C+D}$ whenever $C$ and $D$ commute.
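That commuting-case identity can be illustrated numerically, e.g. by taking $C$ and $D$ to be polynomials in the same matrix $M$ (such matrices always commute); a sketch assuming `scipy`:

```python
import numpy as np
from scipy.linalg import expm

# Y and Z are both polynomials in the same matrix M, so they commute.
M = np.array([[0.3, 0.1], [0.2, 0.4]])
Y, Z = M, M @ M
assert np.allclose(Y @ Z, Z @ Y)  # commuting hypothesis holds

print(np.allclose(expm(Y) @ expm(Z), expm(Y + Z)))  # True
```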
By the way, the expanded power series on each side is absolutely convergent, so it doesn't matter in which order you add up the terms of the sum. That's the formal justification for rearranging the terms until they match the standard proof from basic analysis that $e^{x+y}=e^x e^y$ for all complex numbers $x$ and $y$.
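Conversely, without commutativity the identity generally fails, which is why the hypothesis $AB = BA$ is essential; a quick numerical counterexample (assuming `scipy`, with two standard non-commuting nilpotent matrices):

```python
import numpy as np
from scipy.linalg import expm

# C and D do not commute, and e^C e^D != e^{C+D} here.
C = np.array([[0.0, 1.0], [0.0, 0.0]])
D = np.array([[0.0, 0.0], [1.0, 0.0]])
assert not np.allclose(C @ D, D @ C)  # they really don't commute

print(np.allclose(expm(C) @ expm(D), expm(C + D)))  # False
```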