Proving certain aspects of Entropy


I am trying to prove three properties of entropy.

$1)$ $H(X|Y,Z)\le H(X|Y)$

$2)$ $H(X|Y,Z)\le H(X,Y)$

$3)$ $H(X,Y,Z)+H(Y)\le H(X,Y)+H(Y,Z)$

I have proved the third one, but it is based on part 1.

Should I expand them into summations, or should I use basic properties of entropy such as the chain rule?



There are many ways to do this. Probably the shortest argument is through the definition of mutual information, which is always $\geq 0$. The first property then follows from the definition of conditional mutual information: $$ H(X|Y,Z) = H(X|Y) - I(X;Z|Y) $$ Since $I(X;Z|Y)\geq 0$, this gives (1); and (1) implies (2) because, by the chain rule, $H(X|Y) \leq H(X|Y) + H(Y) = H(X,Y)$.
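As a quick numerical sanity check (not a proof), one can draw a random joint pmf and verify the identity and the resulting inequalities. Everything below is an illustrative sketch; the binary alphabets and the random weights are arbitrary choices:

```python
import itertools
import math
import random

random.seed(0)

# Hypothetical example: a random joint pmf p(x, y, z) over binary alphabets.
support = list(itertools.product(range(2), repeat=3))
weights = [random.random() for _ in support]
total = sum(weights)
p = {xyz: w / total for xyz, w in zip(support, weights)}

def H(*axes):
    """Joint entropy in bits of the chosen coordinates (0 = X, 1 = Y, 2 = Z)."""
    marginal = {}
    for xyz, pr in p.items():
        key = tuple(xyz[a] for a in axes)
        marginal[key] = marginal.get(key, 0.0) + pr
    return -sum(pr * math.log2(pr) for pr in marginal.values() if pr > 0)

# Conditional entropies via H(A|B) = H(A,B) - H(B).
H_X_given_YZ = H(0, 1, 2) - H(1, 2)
H_X_given_Y = H(0, 1) - H(1)
I_XZ_given_Y = H_X_given_Y - H_X_given_YZ  # I(X;Z|Y)

assert I_XZ_given_Y >= -1e-12               # conditional mutual information >= 0
assert H_X_given_YZ <= H_X_given_Y + 1e-12  # property (1)
assert H_X_given_Y <= H(0, 1) + 1e-12       # H(X|Y) <= H(X,Y), giving (2)
```

Any distribution works here; conditioning can only reduce entropy on average, which is exactly what the nonnegativity of $I(X;Z|Y)$ encodes.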

Another way to show (1) is with the chain rule inequality \begin{align} H(X_1,\ldots,X_n) = \sum_{i=1}^{n} H(X_i|X_{1},\ldots,X_{i-1}) \leq \sum_{i=1}^n H(X_i) \end{align} (you can think of this as a sum of entropies, each conditioned on everything seen so far), e.g. $$ H(X,Y,Z)=H(X)+H(Y|X)+H(Z|Y,X)$$

and

\begin{align} H(X,Y|Z) &= E[-\log P(X,Y|Z)] \\ &= E[-\log P(X|Y,Z)P(Y|Z)] \\ &= E[-\log P(X|Y,Z)] + E[-\log P(Y|Z)] \\ &= H(X|Y,Z) + H(Y|Z) \end{align} Convince yourself that $$ H(X|Y,Z)=H(X,Y|Z)-H(Y|Z)=H(X,Z|Y)-H(Z|Y) $$

Then, using the conditioned subadditivity $H(X,Z|Y)\leq H(X|Y)+H(Z|Y)$, we have $$ H(X|Y,Z)=H(X,Z|Y)-H(Z|Y)\leq H(X|Y)+H(Z|Y)-H(Z|Y) = H(X|Y) \leq H(X,Y) $$


For (3), on the left side you have:

$$H(X,Y,Z) +H(Y)= 2 H(Y) + H(X,Z|Y)$$

And on the right:

$$H(X,Y) +H(Y,Z) = H(X|Y) +H(Y) + H(Z|Y) +H(Y) $$

But $H(X,Z|Y) \le H(X|Y) + H(Z|Y)$, which is just a conditioned version of the subadditivity property $H(A,B) \le H(A)+H(B)$. Comparing the two displays gives (3).
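Both the conditioned subadditivity step and the final inequality (3) (which is the submodularity of joint entropy) can be confirmed numerically. As before, this is only an illustrative sketch over an arbitrary random pmf:

```python
import itertools
import math
import random

random.seed(2)

# Hypothetical example: a random joint pmf p(x, y, z) over binary alphabets.
support = list(itertools.product(range(2), repeat=3))
weights = [random.random() for _ in support]
total = sum(weights)
p = {xyz: w / total for xyz, w in zip(support, weights)}

def H(*axes):
    """Joint entropy in bits of the chosen coordinates (0 = X, 1 = Y, 2 = Z)."""
    m = {}
    for xyz, pr in p.items():
        key = tuple(xyz[a] for a in axes)
        m[key] = m.get(key, 0.0) + pr
    return -sum(pr * math.log2(pr) for pr in m.values() if pr > 0)

# Conditioned subadditivity: H(X,Z|Y) <= H(X|Y) + H(Z|Y),
# with each conditional entropy computed as H(A|B) = H(A,B) - H(B).
assert H(0, 1, 2) - H(1) <= (H(0, 1) - H(1)) + (H(1, 2) - H(1)) + 1e-12

# Property (3): H(X,Y,Z) + H(Y) <= H(X,Y) + H(Y,Z).
assert H(0, 1, 2) + H(1) <= H(0, 1) + H(1, 2) + 1e-12
```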