How to prove $2d_H(\{XY\},\{X\}\{Y\})^2 \le I(X,Y)$?


Let $X$ and $Y$ be discrete random variables. Denote the joint distribution of $X$ and $Y$ by $\{XY\}$ and their marginal distributions by $\{X\}$ and $\{Y\}$. Let $\{X\}\{Y\}$ denote the product of their marginal distributions. How to prove $$2d_H(\{XY\},\{X\}\{Y\})^2 \le I(X,Y),$$ where $d_H(\cdot,\cdot)$ is the Hellinger distance of two distributions and $I(\cdot,\cdot)$ denotes their mutual information?
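Before proving the bound, it can be checked numerically. The sketch below (using NumPy; the joint pmf is made-up data, not from the question) computes both sides for a random joint distribution, with the convention $d_H(P,Q)^2 = \frac{1}{2}\sum_i(\sqrt{p_i}-\sqrt{q_i})^2$ used in the answer below:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 3x4 joint pmf for (X, Y) -- hypothetical example data.
pxy = rng.random((3, 4))
pxy /= pxy.sum()

px = pxy.sum(axis=1)     # marginal distribution of X
py = pxy.sum(axis=0)     # marginal distribution of Y
prod = np.outer(px, py)  # product distribution {X}{Y}

# Squared Hellinger distance: d_H^2 = (1/2) * sum_i (sqrt(p_i) - sqrt(q_i))^2
dh2 = 0.5 * np.sum((np.sqrt(pxy) - np.sqrt(prod)) ** 2)

# Mutual information I(X,Y) = sum pxy * log(pxy / (px py)), in nats
mi = np.sum(pxy * np.log(pxy / prod))

print(2 * dh2 <= mi)  # the claimed inequality; prints True
```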

Best answer:

(Essentially copied from here)

$$2 d_H(P,Q)^2 = \sum_i (\sqrt{p_i} - \sqrt{q_i})^2 = \sum_i \left(p_i + q_i - 2 \sqrt{p_i q_i}\right) = 2\left(1 - \sum_i \sqrt{p_i q_i}\right) \tag{1}$$
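The algebra in $(1)$ can be confirmed numerically (a quick NumPy check on two made-up pmfs):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two random pmfs on 6 points -- hypothetical example data.
p = rng.random(6); p /= p.sum()
q = rng.random(6); q /= q.sum()

lhs = np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)    # 2 d_H(P,Q)^2
rhs = 2 * (1 - np.sum(np.sqrt(p * q)))          # 2(1 - sum_i sqrt(p_i q_i))

print(np.isclose(lhs, rhs))  # prints True: the identity in (1)
```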

But since $\log x \le x - 1$ for all $x > 0$,

$$1- \sum_i \sqrt{p_i q_i}\le - \log \left( \sum_i \sqrt{p_i q_i}\right)= - \log E_P \sqrt {Q/P} \tag{2}$$
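The quantity $\sum_i \sqrt{p_i q_i}$ lies in $(0,1]$ (it is $1$ iff $P=Q$), so the bound in $(2)$ is just $\log s \le s-1$ applied to it. A quick check on made-up pmfs:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two random pmfs on 5 points -- hypothetical example data.
p = rng.random(5); p /= p.sum()
q = rng.random(5); q /= q.sum()

s = np.sum(np.sqrt(p * q))   # sum_i sqrt(p_i q_i), with 0 < s <= 1
print(1 - s <= -np.log(s))   # inequality (2); prints True
```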

And by Jensen's inequality (concavity of $\log$), $\log E_P(\sqrt{Q/P}) \ge E_P(\log \sqrt{Q/P}) = \frac{1}{2} E_P(\log (Q/P))$, hence $-\log E_P \sqrt{Q/P} \le \frac{1}{2} E_P \log(P/Q)$. Putting $(1)$ and $(2)$ together,
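The Jensen step can also be verified numerically (NumPy, made-up pmfs): the left side below is $\log E_P\sqrt{Q/P} = \log \sum_i \sqrt{p_i q_i}$, the right side is $E_P\log\sqrt{Q/P} = -\frac{1}{2}KL(P,Q)$.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two random pmfs on 8 points -- hypothetical example data.
p = rng.random(8); p /= p.sum()
q = rng.random(8); q /= q.sum()

# Jensen: log of an expectation dominates the expectation of the log.
lhs = np.log(np.sum(p * np.sqrt(q / p)))      # log E_P sqrt(Q/P)
rhs = np.sum(p * np.log(np.sqrt(q / p)))      # E_P log sqrt(Q/P) = -KL(P,Q)/2

print(lhs >= rhs)  # prints True
```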

$$ 2 d_H(P,Q)^2 \le KL(P,Q)$$

where $KL$ is the relative entropy (Kullback–Leibler divergence). Now replace $P$ by $\{XY\}$ and $Q$ by $\{X\}\{Y\}$: since $KL(\{XY\},\{X\}\{Y\}) = I(X,Y)$ by definition of mutual information, the claimed bound follows.
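The whole chain can be checked at once (NumPy; the joint pmf is made-up data): $2d_H^2 \le KL$, and the KL divergence between the joint and the product of marginals coincides with $I(X,Y)$.

```python
import numpy as np

rng = np.random.default_rng(2)

# A random 4x5 joint pmf -- hypothetical example data.
pxy = rng.random((4, 5)); pxy /= pxy.sum()
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
prod = np.outer(px, py)               # product distribution {X}{Y}

p, q = pxy.ravel(), prod.ravel()      # flatten to plain pmfs P and Q

two_dh2 = np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)  # 2 d_H(P,Q)^2
kl = np.sum(p * np.log(p / q))                    # KL(P,Q)
mi = np.sum(pxy * np.log(pxy / prod))             # I(X,Y)

print(two_dh2 <= kl)       # the proved bound; prints True
print(np.isclose(kl, mi))  # KL({XY},{X}{Y}) = I(X,Y); prints True
```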