Conditional Entropy and Conditional Expectation of Gaussian

I'm reading a paper and came across the following inequality:
\begin{align}
I(X;\hat{X}) &= h(X) - h(X|\hat{X})\\
&\geq h(X) - h(X-\mathbb E[X|\hat{X}])\\
&\cdots\cdots
\end{align}
where $I(X;\hat{X})$ is the mutual information, $h(X)$ is the differential entropy, and $X$ and $\hat{X}$ are two Gaussian random variables. I'm confused about why the inequality holds, and I couldn't find a relevant theorem in any information theory textbook. Could anyone give me some hints? Thanks!
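As a sanity check, here is a small numerical sketch. It assumes (my assumption, for illustration only) that $X$ and $\hat{X}$ are *jointly* Gaussian with unit variances and correlation $\rho$, in which case every term has a closed form: $\mathbb E[X|\hat{X}] = \rho\hat{X}$, so $X - \mathbb E[X|\hat{X}] \sim \mathcal N(0, 1-\rho^2)$, and $I(X;\hat{X}) = -\tfrac12\log(1-\rho^2)$. In this jointly Gaussian case the bound in the paper should hold with equality:

```python
import numpy as np

# Assumed setup (not from the paper): X, Xhat jointly Gaussian,
# zero mean, unit variances, correlation rho.
rho = 0.8

# Differential entropy of X ~ N(0, 1): h(X) = 0.5 * log(2*pi*e * var)
h_X = 0.5 * np.log(2 * np.pi * np.e * 1.0)

# E[X | Xhat] = rho * Xhat, so the residual X - E[X|Xhat] is
# Gaussian with variance 1 - rho^2.
h_resid = 0.5 * np.log(2 * np.pi * np.e * (1 - rho**2))

# Mutual information for jointly Gaussian variables with correlation rho.
I = -0.5 * np.log(1 - rho**2)

lower_bound = h_X - h_resid
print("I(X;Xhat)   =", I)
print("lower bound =", lower_bound)

# The inequality I(X;Xhat) >= h(X) - h(X - E[X|Xhat]) should hold;
# with joint Gaussianity it is tight (equality).
assert I >= lower_bound - 1e-12
```

For non-Gaussian or non-jointly-Gaussian pairs the two sides can differ, which is where the inequality (rather than equality) matters.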