Mutual Information Entropy Inequality


I am trying to prove $H(x,y:z)>H(x:z)+H(y:z)$ and here is what I have.

LHS: $=H(x,y)-H(x,y\mid z)=-\sum p(x,y)\log p(x,y)+\sum p(x,y,z)\log \frac{p(x,y,z)}{p(z)}$

RHS: $=H(x)+H(z)-H(x,z)+H(y)+H(z)-H(y,z)=-\sum p(x)\log p(x)-2\sum p(z)\log p(z)-\sum p(y)\log p(y)+\sum p(x,z)\log p(x,z)+\sum p(y,z)\log p(y,z)$

Does anybody have any tips on how to show LHS $>$ RHS?


I don't think this is provable. $I[x, y : z]$ can be both smaller and greater than $I[x : z] + I[y : z]$.

Examples:

Let $x$, $y$, and $z$ be random variables, not necessarily independent, whose marginal distributions are all Bernoulli with $p = 0.5$. If $x = y = z$, then

$$I[x, y : z] = 1 < 2 = I[x : z] + I[y : z].$$

If $z$ and $x$ are independent and $y = z \oplus x$, where $\oplus$ is the exclusive or, then

$$I[x, y : z] = 1 > 0 = I[x : z] + I[y : z].$$
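Both counterexamples can be checked numerically. The sketch below (the helper `mutual_information` and the distribution dictionaries are my own names, not from the post) computes $I[\cdot : z]$ directly from the joint distribution as $\sum p \log_2 \frac{p}{p_{\text{left}}\,p_{\text{right}}}$:

```python
from collections import defaultdict
from math import log2

def mutual_information(joint, group):
    """I[group : z] for a joint distribution given as {(x, y, z): prob}.

    `group` selects coordinates for the left side:
    (0,) for x, (1,) for y, (0, 1) for the pair (x, y).
    """
    left, right, pair = defaultdict(float), defaultdict(float), defaultdict(float)
    for (x, y, z), p in joint.items():
        a = tuple((x, y)[i] for i in group)  # marginalize onto the chosen group
        left[a] += p
        right[z] += p
        pair[(a, z)] += p
    return sum(p * log2(p / (left[a] * right[z]))
               for (a, z), p in pair.items() if p > 0)

# Example 1: x = y = z ~ Bernoulli(0.5)
ex1 = {(b, b, b): 0.5 for b in (0, 1)}
# Example 2: x, z independent Bernoulli(0.5), y = z XOR x
ex2 = {(x, z ^ x, z): 0.25 for x in (0, 1) for z in (0, 1)}

for joint in (ex1, ex2):
    lhs = mutual_information(joint, (0, 1))                  # I[x, y : z]
    rhs = mutual_information(joint, (0,)) + mutual_information(joint, (1,))
    print(lhs, rhs)
```

For `ex1` this prints $1 < 2$, and for `ex2` it prints $1 > 0$, matching the two displayed inequalities, so neither direction of the inequality can hold in general.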