Proving an entropy inequality

I wish to show that $H(X | Y, Z) \leq H(X | Y )$ for random variables $X,Y,Z$, where $H$ is the Shannon entropy. I know how to show $H(X | Y) \leq H(X)$ by using the Chain Rule on $H(X,Y)\leq H(X)+H(Y).$ I want to do something similar with $H(X,Y,Z)\leq H(X)+H(Y)+H(Z)$, but I'm stuck.
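As a sanity check of the unconditional step, here is a minimal numerical sketch (NumPy; the helper `entropy` and the distribution `joint_xy` are names of my own choosing, not from any library) that verifies $H(X|Y) = H(X,Y) - H(Y) \leq H(X)$ on a random joint distribution:

```python
# A quick numerical sanity check of H(X|Y) <= H(X); the names
# `entropy` and `joint_xy` are my own, not from any library.
import numpy as np

rng = np.random.default_rng(0)

def entropy(q):
    """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

# Random 4x3 joint distribution p(x, y).
joint_xy = rng.random((4, 3))
joint_xy /= joint_xy.sum()

H_XY = entropy(joint_xy.ravel())       # H(X, Y)
H_X = entropy(joint_xy.sum(axis=1))    # marginal H(X)
H_Y = entropy(joint_xy.sum(axis=0))    # marginal H(Y)
H_X_given_Y = H_XY - H_Y               # chain rule: H(X|Y) = H(X,Y) - H(Y)

assert H_X_given_Y <= H_X + 1e-12      # conditioning cannot increase entropy
```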

The inequality in the "conditional case" follows by exactly the same steps as the "unconditional case", since the same equations and inequalities hold with all quantities involved additionally conditioned on one of the variables. Here one conditions on $Y$: the chain rule gives $$ H(X,Z|Y) = H(X|Z,Y)+H(Z|Y) \text{ (chain rule)} $$ and subadditivity gives $$ H(X,Z|Y)\leq H(X|Y)+H(Z|Y) \text{ (conditional independence of $X$ and $Z$ given $Y$ maximizes the joint entropy).} $$ Combining the two and cancelling $H(Z|Y)$ yields $H(X|Y,Z)\leq H(X|Y)$, which is exactly the claimed inequality.
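As a concrete sanity check, here is a minimal numerical sketch (NumPy; names such as `entropy` and `p` are my own) verifying both displayed relations, and hence the conclusion, on a random joint distribution $p(x,y,z)$:

```python
# Numerical check of the conditional chain rule and conditional
# subadditivity on a random joint distribution p(x, y, z).
import numpy as np

rng = np.random.default_rng(1)

def entropy(q):
    """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

p = rng.random((3, 4, 2))   # axes: x, y, z
p /= p.sum()

H_XYZ = entropy(p.ravel())                 # H(X, Y, Z)
H_Y = entropy(p.sum(axis=(0, 2)))          # H(Y)
H_YZ = entropy(p.sum(axis=0).ravel())      # H(Y, Z)
H_XY = entropy(p.sum(axis=2).ravel())      # H(X, Y)

H_XZ_given_Y = H_XYZ - H_Y                 # H(X,Z|Y)
H_X_given_YZ = H_XYZ - H_YZ                # H(X|Z,Y)
H_Z_given_Y = H_YZ - H_Y                   # H(Z|Y)
H_X_given_Y = H_XY - H_Y                   # H(X|Y)

# Chain rule: holds exactly (both sides reduce to H(X,Y,Z) - H(Y)).
assert np.isclose(H_XZ_given_Y, H_X_given_YZ + H_Z_given_Y)
# Conditional subadditivity: a genuine numerical check.
assert H_XZ_given_Y <= H_X_given_Y + H_Z_given_Y + 1e-12
# And therefore the asked inequality H(X|Y,Z) <= H(X|Y):
assert H_X_given_YZ <= H_X_given_Y + 1e-12
```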

You may want to verify these formulas using the same mechanics as in the unconditional case, but with the conditional version of the joint distribution of $X$ and $Z$ (that is, given $Y$) in place of the unconditional one.
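For instance, such a verification might look like the following sketch (again NumPy with names of my own choosing): it computes $H(X,Z|Y)$ once as the $p(y)$-weighted average of the entropies of the conditional distributions $p(x,z|y)$, and once via joint entropies, and checks that the two agree:

```python
# Verify H(X,Z|Y) two ways: as the p(y)-weighted average of the entropy
# of the conditional distribution p(x,z|y), and as H(X,Y,Z) - H(Y).
import numpy as np

rng = np.random.default_rng(2)

def entropy(q):
    """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

p = rng.random((3, 4, 2))   # axes: x, y, z
p /= p.sum()
p_y = p.sum(axis=(0, 2))    # marginal p(y)

# Weighted average of the entropies of the conditional slices p(x,z | Y=y).
direct = sum(p_y[y] * entropy((p[:, y, :] / p_y[y]).ravel())
             for y in range(p.shape[1]))

# Via joint entropies: H(X,Z|Y) = H(X,Y,Z) - H(Y).
via_joint = entropy(p.ravel()) - entropy(p_y)

assert np.isclose(direct, via_joint)
```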