Show that $\log(\det(H_1)) \le \log(\det(H_2)) + \operatorname{tr}[H_2^{-1}H_1] - N$ for all positive semidefinite matrices $H_1,H_2 \in \mathbb{C}^{N\times N}$.
I know that positive semidefinite matrices can be singular, in which case the determinant is zero, so $\log(\det(H_1))$ or $\log(\det(H_2))$ would be $\log(0) = -\infty$.
Moreover, if $H_2$ is singular, the inverse $H_2^{-1}$ on the right-hand side does not exist.
I would be grateful if someone could shed some light on how to proceed with this proof. Is there a specific property of positive semidefinite matrices that handles these degenerate cases?
We assume that $H_1,H_2\in S_n^{>0}$ (i.e., both are positive definite); the degenerate cases are of no interest.
Let $f:Z\in S_n^{>0}\rightarrow tr(H_2^{-1}Z)-\log(\det(Z))+\log(\det(H_2))-n$. We show that the minimum of $f$ is $0$.
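To see why minimality of $f$ gives the claimed inequality, note that $f(H_2)=0$ and evaluate $f$ at $Z=H_1$ (a short check using only the definition above):

```latex
\begin{align*}
f(H_2) &= tr(H_2^{-1}H_2) - \log(\det(H_2)) + \log(\det(H_2)) - n = n - n = 0,\\
f(H_1) &= tr(H_2^{-1}H_1) - \log(\det(H_1)) + \log(\det(H_2)) - n \ge 0\\
&\iff \log(\det(H_1)) \le \log(\det(H_2)) + tr(H_2^{-1}H_1) - n.
\end{align*}
```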
The derivative is $Df_{Z}:K\in S_n\rightarrow tr(H_2^{-1}K)-tr(KZ^{-1})=tr(K(H_2^{-1}-Z^{-1}))$ (indeed, the tangent space to $S_n^{>0}$ in $Z$ is $S_n$, the space of symmetric matrices). Thus $Df_{Z}=0$ iff for every symmetric $K$, $tr(K(H_2^{-1}-Z^{-1}))=0$.
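The term $tr(KZ^{-1})$ in the derivative comes from differentiating $\log(\det(Z))$; a quick first-order sketch, using $\det(I+tA)=1+t\,tr(A)+O(t^2)$:

```latex
\det(Z + tK) = \det(Z)\,\det(I + tZ^{-1}K)
             = \det(Z)\bigl(1 + t\,tr(Z^{-1}K) + O(t^2)\bigr),
\qquad\text{so}\qquad
\left.\frac{d}{dt}\right|_{t=0} \log\det(Z + tK) = tr(Z^{-1}K).
```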
Hence $H_2^{-1}-Z^{-1}$ lies in the orthogonal complement of $S_n$ (for the trace inner product), which is the space of skew-symmetric matrices. Since $H_2^{-1}-Z^{-1}$ is itself symmetric, it must be zero; that is, $H_2^{-1}=Z^{-1}$, or $Z=H_2$.
Thus, if $f$ admits a local extremum, it is necessarily at $Z=H_2$, where $f(H_2)=0$.
It suffices to show that $f$ is convex (note that $S_n^{>0}$ is convex).
The second derivative is
$D^2f_Z(K,L)=tr(KZ^{-1}LZ^{-1})$, where $K,L\in S_n$.
Then $D^2f_Z(K,K)=tr((KZ^{-1})^2)$. Since $Z^{-1}>0$ and $K\in S_n$, the matrix $KZ^{-1}$ is similar to the symmetric matrix $Z^{-1/2}KZ^{-1/2}$ and hence has only real eigenvalues. Consequently $tr((KZ^{-1})^2)$ is a sum of squares of real numbers, so $tr((KZ^{-1})^2)\geq 0$, $D^2f_Z(K,K)\geq 0$ and we are done. $\square$
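As a numerical sanity check of the inequality (not a proof), one can evaluate both sides on random positive definite matrices; the helper names `random_spd` and `gap` below are illustrative choices, not anything from the proof:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    """Random symmetric positive definite n x n matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)  # the shift keeps eigenvalues bounded away from 0

def gap(H1, H2):
    """Right-hand side minus left-hand side of the inequality; should be >= 0."""
    n = H1.shape[0]
    _, logdet1 = np.linalg.slogdet(H1)
    _, logdet2 = np.linalg.slogdet(H2)
    # tr(H2^{-1} H1) computed via a linear solve instead of an explicit inverse
    return logdet2 + np.trace(np.linalg.solve(H2, H1)) - n - logdet1

for n in (2, 5, 10):
    for _ in range(100):
        H1, H2 = random_spd(n), random_spd(n)
        assert gap(H1, H2) >= -1e-10
    # equality case: H1 == H2 makes the gap vanish
    H = random_spd(n)
    assert abs(gap(H, H)) < 1e-10
print("inequality verified on random examples")
```

Note that the check of `gap(H, H)` matches the proof: the minimum of $f$ is attained exactly at $Z=H_2$.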