Prove that Tr(log(AB))=Tr(log(A))+Tr(log(B))


I am trying to prove that for positive-definite matrices $A$, $B$ and $C$,

$$\text{trace}(C\log(AB))=\text{trace}(C\log(A))+\text{trace}(C\log(B)).\quad(1)$$

While I don't know if it is a correct statement or not, I thought proving

$$\text{trace}(\log(AB))=\text{trace}(\log(A))+\text{trace}(\log(B))\quad(2)$$

can help me to prove $(1)$. But I couldn't solve $(2)$ as well. I saw this relationship on Wikipedia here with no proof, so I thought maybe the solution is very easy and I am missing something obvious.

There are 2 answers below.

Best answer:

If you first prove the following:

Hint: the trace is a linear operator: for all square matrices $A$ and $B$ and any scalar $c$, $\text{tr}(A + B) = \text{tr}(A) + \text{tr}(B)$ and $\text{tr}(cA) = c\,\text{tr}(A)$,

then I think you can proceed from there. You should write down a definition of the logarithm of a matrix and use basic matrix algebra. See here and here and try to prove the properties used. This should be a straightforward and instructive exercise.

EDIT: The more interesting case, raised in the comments, is when $A,B$ are square matrices that do not commute, so $AB \neq BA$; in that case $\log(AB) = \log(A)+\log(B)$ need not hold. But there is this important proposition:

Proposition: Let $A$ be a square matrix with real (or complex) entries. Then it is true that $$\det(e^A) = e^{\text{tr}(A)}$$

Note that $e^A$ is always well-defined, since its power series converges for every square matrix $A$. Taking the logarithm of both sides of the equation gives

$$\text{tr}(A) = \log(\det(e^A)) \tag{1}$$
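The identity $\det(e^A) = e^{\text{tr}(A)}$ is easy to sanity-check numerically. The sketch below is my own illustration (not part of the answer): it uses NumPy, and the diagonalization-based `expm` is a quick stand-in for a real matrix exponential, valid for the generic (diagonalizable) random matrix used here.

```python
import numpy as np

def expm(M):
    # Matrix exponential via diagonalization. Assumes M is diagonalizable,
    # which holds for a generic random matrix.
    w, P = np.linalg.eig(M)
    return P @ np.diag(np.exp(w)) @ np.linalg.inv(P)

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
lhs = np.linalg.det(expm(A))   # det(e^A), possibly with tiny imaginary noise
rhs = np.exp(np.trace(A))      # e^{tr(A)}
assert np.isclose(lhs, rhs)
```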

Now suppose that $\log(AB)$, $\log(A)$ and $\log(B)$ are all defined by their power series, which requires $\Vert AB - I \Vert < 1$, $\Vert A - I \Vert < 1$ and $\Vert B - I \Vert < 1$, where $\Vert \cdot \Vert$ denotes the operator norm on the space of square matrices with real (or complex) entries:

$$\Vert A \Vert := \sup_{u \neq 0,\; u \in \mathbb{C}^n}\frac{\Vert Au \Vert_{\mathbb{C}^n}}{\Vert u \Vert_{\mathbb{C}^n}}, \qquad \Vert u \Vert_{\mathbb{C}^n} := \sqrt{\vert u_1\vert ^2 + \dots +\vert u_n \vert^2}.$$

With these hypotheses, the Taylor series defining the matrix logarithm converges.
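The convergence claim can be checked directly. The sketch below (my addition, using NumPy) sums the Mercator series $\log(M) = \sum_{k\geq 1} (-1)^{k+1}(M-I)^k/k$ for a matrix close to the identity, then confirms that exponentiating the result recovers the matrix:

```python
import math
import numpy as np

def log_series(M, terms=200):
    # Mercator series for the matrix log, convergent when ||M - I|| < 1.
    n = M.shape[0]
    X = M - np.eye(n)
    acc = np.zeros_like(M, dtype=float)
    power = np.eye(n)
    for k in range(1, terms + 1):
        power = power @ X
        acc += (-1) ** (k + 1) * power / k
    return acc

# A matrix close to the identity, so the series converges.
A = np.eye(3) + 0.1 * np.arange(9).reshape(3, 3) / 10.0
L = log_series(A)
# exp(L) should recover A; here exp is summed as a truncated power series.
E = sum(np.linalg.matrix_power(L, k) / math.factorial(k) for k in range(20))
assert np.allclose(E, A)
```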

We also have a lemma:

Lemma: Let $A$ be a square matrix with real (or complex) entries such that $\Vert e^A - I \Vert < 1$. Then we get that $$\log(e^A)= A$$

With that we can apply $(1)$ with $A$ replaced by $\log(AB)$, using $e^{\log(AB)} = AB$, and then

$$\text{tr}(\log(AB)) = \log(\det(AB)) = \log(\det(A)\det(B)) = \log(\det(A)) + \log(\det(B))$$
$$= \log(\det(e^{\log(A)})) + \log(\det(e^{\log(B)})) \stackrel{(1)}{=} \text{tr}(\log(A)) + \text{tr}(\log(B))$$
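This chain of equalities can also be verified numerically. The sketch below is my own illustration: it builds a principal matrix log by diagonalization (valid here because symmetric positive-definite $A$, $B$ and their product $AB$ have positive eigenvalues) and checks equality $(2)$ for random positive-definite matrices:

```python
import numpy as np

def logm(M):
    # Principal log via diagonalization; assumes M is diagonalizable with
    # no eigenvalue on the closed negative real axis.
    w, P = np.linalg.eig(M)
    return P @ np.diag(np.log(w.astype(complex))) @ np.linalg.inv(P)

rng = np.random.default_rng(1)

def rand_spd(n):
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)   # symmetric positive definite

A, B = rand_spd(4), rand_spd(4)
lhs = np.trace(logm(A @ B))
rhs = np.trace(logm(A)) + np.trace(logm(B))
assert np.isclose(lhs, rhs)   # equality (2) holds for these A, B
```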

Reference in Portuguese

Another answer:

Let $U=\mathbb{C}\setminus \{z\in\mathbb{R}:z\leq 0\}$ and let $Z_n$ be the subset of $M_n(\mathbb{C})$ consisting of the matrices whose eigenvalues all lie in $U$. We consider the principal log, which is uniquely defined as follows:

For $n=1$: $re^{i\theta}\in U\mapsto\log(re^{i\theta})=\log(r)+i(\theta+2k\pi)$, where $k$ is chosen so that $\theta+2k\pi\in (-\pi,\pi)$.

For $n>1$, if $A\in Z_n$ is diagonalizable ($A=P\,\mathrm{diag}(\lambda_i)\,P^{-1}$), then $\log(A)=P\,\mathrm{diag}(\log(\lambda_i))\,P^{-1}$.
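Following this definition, here is a minimal NumPy sketch (my own illustration) of the principal log of a diagonalizable matrix with eigenvalues in $U$, checked by exponentiating in the same eigenbasis:

```python
import numpy as np

# A diagonalizable matrix A = P diag(2, 3) P^{-1}; eigenvalues 2 and 3 lie in U.
P = np.array([[1.0, 1.0], [0.0, 1.0]])
Pinv = np.linalg.inv(P)
A = P @ np.diag([2.0, 3.0]) @ Pinv

# Principal log, following the definition: log on the eigenvalues.
logA = P @ np.diag(np.log([2.0, 3.0])) @ Pinv

# Exponentiating in the same eigenbasis recovers A.
back = P @ np.diag(np.exp(np.log([2.0, 3.0]))) @ Pinv
assert np.allclose(back, A)
```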

Remark 1. "$\log(A)$ is defined" does not imply that $||A-I||<1$.

Remark 2. If $A\in Z_n$, then $tr(A)$ is not necessarily $\log(\det(e^A))$. Indeed, let $n=2$ and $A=(2i\pi/3)I_2$; then $tr(A)=4i\pi/3$ while $\log(\det(e^A))=\log(e^{4i\pi/3})=-2i\pi/3$. Yet the result is true if $A$ has only eigenvalues $>0$.
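This counterexample is a one-liner to verify (my addition, using Python's `cmath`, whose `log` is the principal branch with argument in $(-\pi,\pi]$):

```python
import cmath

# A = (2i*pi/3) I_2, so tr(A) = 4i*pi/3 and det(e^A) = e^{tr(A)}.
tr_A = 4j * cmath.pi / 3
log_det = cmath.log(cmath.exp(tr_A))   # principal log wraps into (-pi, pi]
assert abs(log_det - (-2j * cmath.pi / 3)) < 1e-12
assert abs(log_det - tr_A) > 1         # so tr(A) != log(det(e^A)) here
```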

Remark 3. Even if $AB=BA$ with $A,B\in Z_n$, $\log(AB)$ is not necessarily equal to $\log(A)+\log(B)$. For example, if $n=1$ and $A=e^{2i\pi/3}$, then $\log(A^2)=-2i\pi/3$ while $\log(A)+\log(A)=4i\pi/3$. Yet, if $A,B\in S^+$ (they are symmetric positive definite) and $AB=BA$, then the result is true.
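Again this is quickly checked with the principal branch of the complex log (my addition, via `cmath`):

```python
import cmath

# Remark 3 counterexample with n = 1: A = e^{2i*pi/3}.
A = cmath.exp(2j * cmath.pi / 3)
log_A2 = cmath.log(A * A)                   # principal log of A^2
assert abs(log_A2 - (-2j * cmath.pi / 3)) < 1e-12
assert abs(log_A2 - 2 * cmath.log(A)) > 1   # log(A^2) != log(A) + log(A)
```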

Proof of your equality (2). You can formally use the last two lines of Rafael Wagner's post, but beware: that works because $A,B\in S^+$ implies that $AB$ has only eigenvalues $>0$ (this is the key point!).

It remains for you to prove (1), or at least to take a look at it; so far, I do not think you are too tired.

EDIT 1. @Mah, really, you are not being serious. I thought you had run a few tests before conjecturing equality (1). I chose positive matrices $A, B, C$ at random and, on the first try, found that (1) does not hold!! I will not answer your questions any more.
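Such a random test is easy to reproduce. The sketch below (my own illustration, with a diagonalization-based principal log, valid since $AB$ has positive eigenvalues when $A,B$ are symmetric positive definite) exhibits a triple for which equality (1) fails:

```python
import numpy as np

def logm(M):
    # Principal log via diagonalization; eigenvalues of M must avoid the
    # closed negative real axis, which holds here since A, B are SPD.
    w, P = np.linalg.eig(M)
    return P @ np.diag(np.log(w.astype(complex))) @ np.linalg.inv(P)

rng = np.random.default_rng(2)

def rand_spd(n):
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B, C = (rand_spd(3) for _ in range(3))
lhs = np.trace(C @ logm(A @ B))
rhs = np.trace(C @ logm(A)) + np.trace(C @ logm(B))
diff = abs(lhs - rhs)
assert diff > 1e-8   # equality (1) fails for this random triple
```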

EDIT 2. We assume that the considered matrices are real.

Proposition. Let $A,B\in S^+$ and $U=\log(AB)-\log(A)-\log(B)$. Then equality (1) holds for every positive definite $C$ $\Leftrightarrow$

for every $C\in S$ (the symmetric matrices), $tr(CU)=0$,

$\Leftrightarrow \log(AB)+\log(BA)=2(\log(A)+\log(B))$.

Proof. The fact that $S^+$ is open in $S$ gives the first equivalence. Thus $U$ is orthogonal to $S$ for the standard scalar product on $M_n$, which is equivalent to $U$ being skew-symmetric; since $\log(BA)=\log((AB)^T)=\log(AB)^T$, the condition $U+U^T=0$ is exactly the displayed equality, and we are done.
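The two ingredients of this proof can be probed numerically. The sketch below (my own illustration, reusing a diagonalization-based principal log) checks that $tr(U)=0$ (equality (2)) and that $U+U^T$ equals $\log(AB)+\log(BA)-2(\log(A)+\log(B))$ for random positive-definite $A,B$:

```python
import numpy as np

def logm(M):
    # Principal log via diagonalization; valid here since the eigenvalues
    # of A, B and AB are all positive.
    w, P = np.linalg.eig(M)
    return P @ np.diag(np.log(w.astype(complex))) @ np.linalg.inv(P)

rng = np.random.default_rng(3)

def rand_spd(n):
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = rand_spd(3), rand_spd(3)
U = logm(A @ B) - logm(A) - logm(B)

# tr(U) = 0 is equality (2).
assert abs(np.trace(U)) < 1e-8
# U + U^T matches the last condition, since log(BA) = log((AB)^T) = log(AB)^T.
S = logm(A @ B) + logm(B @ A) - 2 * (logm(A) + logm(B))
assert np.allclose(U + U.T, S)
```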