The question below concerns equation $(2.4)$ of the following paper:
MATRIX EXPONENTIATED GRADIENT UPDATES.
Let $M$ and $N$ be two $n \times n$ positive definite matrices with eigendecompositions $M=U\Lambda U^{\top}$ and $N=\tilde{U}\tilde{\Lambda} \tilde{U}^{\top}$, where $(\lambda_i,u_i)$ are the eigenpairs of $M$ and $(\tilde{\lambda}_j,\tilde{u}_j)$ are the eigenpairs of $N$.
How can one show the following identity? $$\text{Tr}(M\log N)=\sum_{i,j}\lambda_i\log(\tilde{\lambda}_j)(u_i^{\top}\tilde{u}_j)^2$$
First, I do not understand what $i,j$ mean in the summation, or why the identity involves a double sum. Second, I do not know how to derive it.
My try:
\begin{align} \text{Tr}(M\log N) &= \text{Tr}(U\Lambda U^{\top} \tilde{U}\log(\tilde{\Lambda})\tilde{U}^{\top}) \\ & = \text{Tr}(\Lambda U^{\top} \tilde{U}\log(\tilde{\Lambda})\tilde{U}^{\top}U) \end{align}
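As a quick sanity check of the step above (this numpy snippet is my own illustration, not part of the original post), one can verify numerically that cycling $U^{\top}$ around the trace leaves it unchanged, i.e. $\text{Tr}(U\Lambda U^{\top}\tilde{U}\log(\tilde{\Lambda})\tilde{U}^{\top}) = \text{Tr}(\Lambda U^{\top}\tilde{U}\log(\tilde{\Lambda})\tilde{U}^{\top}U)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    # random symmetric positive definite matrix
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

M, N = random_spd(4), random_spd(4)
lam, U = np.linalg.eigh(M)        # M = U diag(lam) U^T
tlam, V = np.linalg.eigh(N)       # N = V diag(tlam) V^T
L, K = np.diag(lam), np.diag(np.log(tlam))

lhs = np.trace(U @ L @ U.T @ V @ K @ V.T)   # Tr(M log N)
rhs = np.trace(L @ U.T @ V @ K @ V.T @ U)   # after cycling U to the back

assert np.isclose(lhs, rhs)
```

This confirms the cyclic invariance of the trace, $\text{Tr}(ABC)=\text{Tr}(BCA)$, which is the only property used so far.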
How can I proceed using matrix calculus to get the result without expanding everything? What is the hidden trick?
To simplify the notation, let us write $$ \kappa_j=\log\tilde{\lambda}_j $$ and $$ v_j=\tilde{u}_j. $$ With this notation, it suffices to show $$ \text{tr}\left(M\log N\right)=\sum_{i,j=1}^n\lambda_i\kappa_j\left(u_i^{\top}v_j\right)^2. $$
Now, let us begin the proof. Since $N=\sum_{j=1}^n\tilde{\lambda}_jv_jv_j^{\top}$ and the matrix logarithm acts on the eigenvalues, we have the spectral decompositions \begin{align} M&=\sum_{j=1}^n\lambda_ju_ju_j^{\top},\\ \log N&=\sum_{j=1}^n\kappa_jv_jv_j^{\top}. \end{align} Therefore \begin{align} \text{tr}\left(M\log N\right)&=\text{tr}\left(\sum_{i=1}^n\lambda_iu_iu_i^{\top}\sum_{j=1}^n\kappa_jv_jv_j^{\top}\right)\\ &=\text{tr}\left(\sum_{i,j=1}^n\lambda_i\kappa_ju_iu_i^{\top}v_jv_j^{\top}\right)\\ &=\sum_{i,j=1}^n\lambda_i\kappa_j\text{tr}\left(u_iu_i^{\top}v_jv_j^{\top}\right)\\ &=\sum_{i,j=1}^n\lambda_i\kappa_j\text{tr}\left(v_j^{\top}u_iu_i^{\top}v_j\right)\\ &=\sum_{i,j=1}^n\lambda_i\kappa_j\left(v_j^{\top}u_i\right)\left(u_i^{\top}v_j\right)\\ &=\sum_{i,j=1}^n\lambda_i\kappa_j\left(u_i^{\top}v_j\right)^2. \end{align} This completes the proof.
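The final identity can also be verified numerically (again, my own sketch rather than part of the original answer): compute $\text{tr}(M\log N)$ via the spectral logarithm and compare it with the double sum over eigenpairs.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_spd(n):
    # random symmetric positive definite matrix
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

n = 5
M, N = random_spd(n), random_spd(n)
lam, U = np.linalg.eigh(M)        # eigenpairs (lam_i, u_i) of M
tlam, V = np.linalg.eigh(N)       # eigenpairs (tlam_j, v_j) of N

# left-hand side: tr(M log N), with log N built spectrally
logN = V @ np.diag(np.log(tlam)) @ V.T
lhs = np.trace(M @ logN)

# right-hand side: sum_{i,j} lam_i * log(tlam_j) * (u_i^T v_j)^2
overlap = (U.T @ V) ** 2          # matrix of (u_i^T v_j)^2
rhs = lam @ overlap @ np.log(tlam)

assert np.isclose(lhs, rhs)
```

Note that the matrix `overlap` is doubly stochastic (its rows and columns sum to $1$, since $U$ and $V$ are orthogonal), which is why the double sum is a weighted average of the products $\lambda_i\kappa_j$.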
I would like to thank @loupblanc for kind advices.