An inequality on mutual entropy


The question is as follows:

Prove $H(X,Y,Z) - H(X,Y) \le H(X,Z)-H(X)$.

Here, I tried to prove instead

$$H(X,Z) - H(X) + H(X,Y)-H(X,Y,Z) \ge 0$$

I know that $H(X,Y,Z) = H(X,Y) + H(Z|X,Y)\ $ and $H(X,Z) = H(X)+H(Z|X)$. So if we eliminate the same terms, we get

$$H(X,Z) - H(X) + H(X,Y)-H(X,Y,Z) = H(Z|X)-H(Z|X,Y)$$

From here, I can intuitively say that $H(Z|X)-H(Z|X,Y) \ge 0$, because conditioning on both $X$ and $Y$ leaves no more randomness in $Z$ than conditioning on $X$ alone. But I couldn't prove it mathematically (by manipulating the expression further, in other words). Any help would be appreciated. Thank you in advance.
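Not a proof, but the inequality can be sanity-checked numerically on a random joint distribution (a quick sketch; the binary alphabet and seed are arbitrary choices):

```python
import itertools
import math
import random

random.seed(0)

# Random joint pmf p(x, y, z) over a 2x2x2 alphabet.
vals = [random.random() for _ in range(8)]
total = sum(vals)
p = {xyz: v / total
     for xyz, v in zip(itertools.product(range(2), repeat=3), vals)}

def H(indices):
    """Entropy (in bits) of the marginal over the given coordinates."""
    marg = {}
    for xyz, prob in p.items():
        key = tuple(xyz[i] for i in indices)
        marg[key] = marg.get(key, 0.0) + prob
    return -sum(q * math.log2(q) for q in marg.values() if q > 0)

lhs = H((0, 1, 2)) - H((0, 1))   # H(X,Y,Z) - H(X,Y) = H(Z|X,Y)
rhs = H((0, 2)) - H((0,))        # H(X,Z)   - H(X)   = H(Z|X)
assert lhs <= rhs + 1e-12
print(f"H(Z|X,Y) = {lhs:.4f} <= H(Z|X) = {rhs:.4f}")
```

Rerunning with other seeds or alphabet sizes gives the same ordering, which is at least reassuring while hunting for the formal argument.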

According to https://en.wikipedia.org/wiki/Conditional_mutual_information, $H(Z|X)-H(Z|X,Y)=I(Y;Z|X)$, which is $\geq 0$ by the nonnegativity of conditional mutual information.
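For completeness, the identity and the nonnegativity can be written out from the standard definitions (using $p(z|x,y) = p(y,z|x)/p(y|x)$ in the middle step):

$$H(Z|X) - H(Z|X,Y) = \sum_{x,y,z} p(x,y,z)\log\frac{p(z|x,y)}{p(z|x)} = \sum_{x,y,z} p(x,y,z)\log\frac{p(y,z|x)}{p(y|x)\,p(z|x)} = I(Y;Z|X),$$

and the last expression is an expected KL divergence, $\sum_x p(x)\, D\!\left(p(y,z|x)\,\|\,p(y|x)p(z|x)\right)$, hence $\geq 0$ by Gibbs' inequality.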