Entropy Inequality $H(X| g(Y))\geq H(X|Y)$


Let $X,Y$ be discrete random variables, and let $H(X) = -\sum\limits_{x}P\{X=x\}\operatorname{log}_2P\{X=x\}$ be the entropy function. It is a known fact that $H(g(Y))\leq H(Y)$. I want to prove the following inequality, which seems obvious in terms of common sense: $$ H(X|g(Y))\geq H(X|Y). $$ But a formal proof is rather bulky. Does anybody know an elegant proof of this fact?

Best answer:

Use the data processing inequality: since $X \to Y \to g(Y)$ is a Markov chain, \begin{align} I(X;Y) &\ge I(X;g(Y)) \\ H(X)-H(X|Y) &\ge H(X)-H(X|g(Y)) && \text{(chain rule)} \\ H(X|Y) &\le H(X|g(Y)). \end{align}
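The inequality can also be checked numerically. Below is a small sketch (not part of the original answer) that draws a random joint distribution $p(x,y)$, merges values of $Y$ through a non-injective map $g$, and compares $H(X|Y)$ with $H(X|g(Y))$; the function names and the particular $g$ are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random joint distribution: X takes 3 values, Y takes 4 values.
p = rng.random((3, 4))
p /= p.sum()

def cond_entropy(joint):
    """H(X|Y) in bits for a joint distribution joint[x, y]."""
    py = joint.sum(axis=0)  # marginal distribution of Y
    h = 0.0
    for y in range(joint.shape[1]):
        if py[y] == 0:
            continue
        px_given_y = joint[:, y] / py[y]
        nz = px_given_y > 0
        h -= py[y] * np.sum(px_given_y[nz] * np.log2(px_given_y[nz]))
    return h

# A non-injective g: Y in {0,1,2,3} -> g(Y) in {0,1}.
# The joint distribution of (X, g(Y)) merges the corresponding columns.
g = np.array([0, 0, 1, 1])
joint_g = np.zeros((3, 2))
for y, gy in enumerate(g):
    joint_g[:, gy] += p[:, y]

h_xy = cond_entropy(p)         # H(X|Y)
h_xgy = cond_entropy(joint_g)  # H(X|g(Y))
print(f"H(X|Y)    = {h_xy:.4f}")
print(f"H(X|g(Y)) = {h_xgy:.4f}")
```

For any joint distribution and any $g$, the printed values satisfy $H(X|g(Y)) \ge H(X|Y)$, in agreement with the data-processing argument above.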