Entropy: Show $ H(X|Y) + H(Y|Z) \geq H(X|Z) $


I was given the following exercise: Let $X$, $Y$, $Z$ be random variables.

Show that $H(X \mid Y) + H(Y \mid Z) \geq H(X \mid Z)$.

Hint: consider $H(X, Y \mid Z)$.

We showed in the lecture that $H(X,Y \mid Z) = H(X \mid Z) + H(Y \mid X,Z)$.

Using the chain rule $$ H(X_1, \dots , X_n) = \sum_{k=1}^n H(X_k \mid X_{k-1}, \dots , X_1) $$ I can further deduce:
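As a quick sanity check of the two-variable chain rule $H(X,Y) = H(X) + H(Y \mid X)$, here is a small Python sketch on a made-up joint pmf (the numbers are illustrative, not from the exercise):

```python
import math

# Hypothetical toy joint pmf p(x, y) over {0,1}^2 (made-up numbers).
p = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

# Marginal distribution of X.
px = {x: sum(v for (a, _), v in p.items() if a == x) for x in (0, 1)}

def H(dist):
    """Shannon entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(v * math.log2(v) for v in dist.values() if v > 0)

H_xy = H(p)
H_x = H(px)
# H(Y | X) computed directly from the definition.
H_y_given_x = -sum(v * math.log2(v / px[x]) for (x, _), v in p.items() if v > 0)

# Chain rule: H(X, Y) = H(X) + H(Y | X), up to floating-point rounding.
assert abs(H_xy - (H_x + H_y_given_x)) < 1e-12
```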

$H(X, Y \mid Z) = H(X \mid Z) + H(Y \mid X, Z)$ and, since joint entropy is permutation invariant, also $H(X, Y \mid Z) = H(Y \mid Z) + H(X \mid Y, Z)$.

Since I do not know of any independence assumptions, I cannot assume that any of the entropies is $0$. So I think I need to use one of the following two lemmas of the information inequality theorem:

a) $H(X \mid Y) \leq H(X)$

b) $H(X_1, \dots, X_n) \leq H(X_1) + \dots + H(X_n)$
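Lemma b) (subadditivity) can likewise be checked numerically on a toy distribution; this sketch uses a made-up correlated pmf over two binary variables:

```python
import math

# Hypothetical correlated joint pmf p(x, y) (made-up numbers).
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {x: sum(v for (a, _), v in p.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in p.items() if b == y) for y in (0, 1)}

def H(dist):
    """Shannon entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(v * math.log2(v) for v in dist.values() if v > 0)

# Lemma b): H(X, Y) <= H(X) + H(Y), with equality iff X and Y are independent.
assert H(p) <= H(px) + H(py) + 1e-12
```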

However, I cannot see how to make use of these to prove the given statement.

Best answer

Qualitatively, the LHS accounts for the remaining uncertainty about $X$ given $Y$ and about $Y$ given $Z$, while the RHS ignores $Y$ entirely; resolving the uncertainty on the left is therefore enough to resolve the uncertainty on the right. Quantitatively, we have

$H(X,Y|Z)=H(Y|Z)+H(X|Y,Z)$

We know that $H(X|Y)\geq H(X|Y,Z)$, i.e., information never hurts, so

$H(X|Y)+H(Y|Z) \geq H(X,Y|Z)$.

Joint entropy can only grow when a variable is added: by the chain rule,

$H(X,Y|Z) = H(X|Z) + H(Y|X,Z) \geq H(X|Z)$, since conditional entropy is nonnegative. So we have

$H(X|Y)+H(Y|Z) \geq H(X|Y,Z)+H(Y|Z) = H(X,Y|Z) \geq H(X|Z)$
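As a numerical sanity check of the whole chain of inequalities, here is a Python sketch that draws a random joint pmf over three binary variables (no independence assumed) and verifies $H(X|Y)+H(Y|Z) \geq H(X|Z)$; the helper `H_cond` is an illustrative implementation, not from the original post:

```python
import itertools
import math
import random

def H_cond(p, A, B):
    """Conditional entropy H(A | B) in bits, where p maps (x, y, z)
    triples to probabilities and A, B are tuples of coordinate indices."""
    def marginal(idx):
        m = {}
        for outcome, prob in p.items():
            key = tuple(outcome[i] for i in idx)
            m[key] = m.get(key, 0.0) + prob
        return m
    pab = marginal(A + B)   # joint pmf of (A, B)
    pb = marginal(B)        # marginal pmf of B
    h = 0.0
    for key, prob in pab.items():
        if prob > 0:
            b = key[len(A):]
            h -= prob * math.log2(prob / pb[b])
    return h

random.seed(0)
# Random joint distribution over {0,1}^3 -- no independence assumed.
weights = [random.random() for _ in range(8)]
total = sum(weights)
p = {xyz: w / total
     for xyz, w in zip(itertools.product([0, 1], repeat=3), weights)}

X, Y, Z = (0,), (1,), (2,)
lhs = H_cond(p, X, Y) + H_cond(p, Y, Z)
rhs = H_cond(p, X, Z)
assert lhs >= rhs - 1e-12  # H(X|Y) + H(Y|Z) >= H(X|Z)
```

Re-running with different seeds exercises the inequality on other joint distributions; it holds for every pmf, as the proof above shows.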