Trace of matrix that is a product of 2 others.


We consider that $A,B$ are two square matrices. I would like to know if there is a proof that $$tr(AB)=tr(BA)$$

I am looking for a special kind of proof that avoids sigma notation and the definition of matrix multiplication, since with those the identity is immediate. Is there a deeper meaning behind this trace property?

Thanks!!


There are 3 answers below.

Answer (8 votes)

It means that the sums of the nonzero eigenvalues of $AB$ and $BA$ are the same.

A simple argument assuming that the trace is the sum of the eigenvalues is the following:

If $ABv=\lambda v$ with $\lambda\ne0$ and $v\ne0$, then $$BA(Bv)=\lambda (Bv).$$ Notice that $Bv\ne0$: otherwise $ABv=A(Bv)=0$, contradicting $ABv=\lambda v\ne0$. Summing up, if $\lambda$ is a nonzero eigenvalue of $AB$, then $\lambda$ is an eigenvalue of $BA$ (and, by symmetry, conversely).

To see that the Jordan structure of $AB$ and $BA$ at nonzero eigenvalues is the same, apply the same trick to chains of generalized eigenvectors: multiply the identity $ABw=\lambda w+v$ on the left by $B$ to obtain $BA(Bw)=\lambda(Bw)+Bv$.
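As a quick numerical sanity check of the eigenvalue argument (a sketch using NumPy; the matrices and variable names are just illustrative), for square $A,B$ the full spectra of $AB$ and $BA$ coincide with multiplicity, so in particular their traces agree:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Eigenvalues of AB and BA, sorted so the multisets can be compared directly.
eig_AB = np.sort_complex(np.linalg.eigvals(A @ B))
eig_BA = np.sort_complex(np.linalg.eigvals(B @ A))

# Same spectrum (with multiplicity), hence the same trace.
assert np.allclose(eig_AB, eig_BA)
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```

For rectangular $A$ ($m\times n$) and $B$ ($n\times m$) the same check works after padding: the spectra then agree only up to extra zero eigenvalues, which is exactly the "nonzero eigenvalues" caveat above.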

Answer (0 votes)

Let $V$ be a finite-dimensional $k$-vector space. Then we have the canonical isomorphism $\def\Hom{\operatorname{Hom}}\Hom(V,V) \cong V \otimes_k V^*$, under which the trace corresponds to the canonical evaluation map $V \otimes_k V^* \to k$, $v \otimes \varphi \mapsto \varphi(v)$. The composition map $c \colon \Hom(V,V) \otimes_k \Hom(V,V) \to \Hom(V,V)$ is given by $$ \Hom(V,V) \otimes_k \Hom(V,V) \cong (V \otimes_k V^*) \otimes_k (V \otimes_k V^*) \cong V \otimes_k (V^* \otimes_k V) \otimes_k V^* \to V \otimes_k V^*, $$ where the last arrow applies the evaluation map to the inner factor $V^* \otimes_k V$. Now let $s \colon \Hom(V,V) \otimes_k \Hom(V,V) \to \Hom(V,V) \otimes_k \Hom(V,V)$ denote the "swapping" map $s(A \otimes B) = B \otimes A$. Then we have to prove that $\mathrm{tr}\circ c = \mathrm{tr} \circ c \circ s$, as maps
$$ (V \otimes_k V^*) \otimes_k (V \otimes_k V^*) \to k. $$ Applying $c$ evaluates the inner factors against each other, and the subsequent trace evaluates the outer factors. Applying $s$ first merely switches the two tensor factors, so the former inner pair becomes the outer pair and vice versa; either way, the same two evaluations are multiplied. Hence both maps agree on simple tensors and, by linearity, on the whole space.
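In coordinates, the two contractions in this argument can be written down explicitly with `einsum` (a sketch with NumPy; the index labels are mine): $\operatorname{tr}(AB)$ contracts one index pair by composition and the other by the trace, and swapping the tensor factors just relabels which pair is which.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# tr(AB): the j-pair is contracted by composition, the i-pair by the trace.
tr_AB = np.einsum('ij,ji->', A, B)
# Swapping the factors (s in the argument above) relabels i <-> j,
# which is a symmetry of the same double contraction.
tr_BA = np.einsum('ij,ji->', B, A)

assert np.isclose(tr_AB, tr_BA)
assert np.isclose(tr_AB, np.trace(A @ B))
```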

Answer (2 votes)

Every real square matrix can be decomposed into a symmetric and an antisymmetric part:

$$T=S+A$$ where $S=\frac12(T+T^T)$ and $A=\frac12(T-T^T)$. Evaluate:

$$T_1T_2-T_2T_1=(S_1 S_2-S_2S_1)+(S_1 A_2-A_2S_1)+(A_1 S_2-S_2A_1)+(A_1 A_2-A_2A_1)$$ Now, the first and last groups are traceless, because $A_1A_2=(-A_1^T)(-A_2^T)=(A_2A_1)^T$ and the trace is invariant under transposition (the same works for $Tr(S_1S_2)$, since $S_1S_2=(S_2S_1)^T$). The middle two groups consist of products of a symmetric and an antisymmetric matrix. To prove that $Tr(SA)$ (a product of a symmetric and an antisymmetric matrix) is zero, you have to employ a further property of the trace: it is preserved under diagonalization. In an orthonormal eigenbasis of $S$, the product becomes a diagonal matrix times a matrix that is still antisymmetric, and such a product has only zeros on its diagonal.

So, we proved $Tr(T_1 T_2)-Tr(T_2T_1)=0$.
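The two facts the proof rests on, $Tr(SA)=0$ and the traceless commutator, can be checked numerically (a sketch with NumPy; matrices chosen at random for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
T1 = rng.standard_normal((4, 4))
T2 = rng.standard_normal((4, 4))

# Symmetric and antisymmetric parts: T = S + A.
S1, A1 = 0.5 * (T1 + T1.T), 0.5 * (T1 - T1.T)
S2, A2 = 0.5 * (T2 + T2.T), 0.5 * (T2 - T2.T)

# tr(SA) = 0 for symmetric S and antisymmetric A.
assert np.isclose(np.trace(S1 @ A2), 0.0)
assert np.isclose(np.trace(A1 @ S2), 0.0)
# Hence the commutator is traceless: tr(T1 T2) = tr(T2 T1).
assert np.isclose(np.trace(T1 @ T2 - T2 @ T1), 0.0)
```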


Maybe a deeper meaning can be handwaved this way: in physics, a matrix frequently represents a tensor, and it is useful to see tensors as directionally dependent generalizations of scalars. In that case, the trace picks out the isotropic (scalar) part of the quantity. It is very common to decompose a matrix into an isotropic part, a traceless symmetric part, and an antisymmetric part (the last is traceless by definition, but it makes sense to write it separately). Scalars commute. So the trace, which captures the scalar part of the matrix, should be independent of the order of multiplication, just as scalars are.