Subadditivity of Entropy


We define $H(X) = -\sum_{x}p_{x}\log p_{x}$ and relative entropy as $H(p(x)||q(x)) = \sum_{x}p(x)\log \frac{p(x)}{q(x)} = -H(X)-\sum_{x}p(x)\log q(x).$

Now we have to prove that $H(X,Y,Z) + H(Y) \leq H(X,Y) + H(Y,Z)$, with equality if and only if $X\rightarrow Y\rightarrow Z$ forms a Markov chain.
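Before proving it, the inequality is easy to sanity-check numerically. The sketch below (my own illustration, not part of the original question) draws a random joint distribution $p(x,y,z)$ with NumPy, computes the four entropies by marginalizing, and confirms that $H(X,Y,Z) + H(Y) \leq H(X,Y) + H(Y,Z)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(p):
    # Shannon entropy in bits; terms with p = 0 contribute 0
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# a random joint distribution p(x, y, z) over 3 x 3 x 3 outcomes
p = rng.random((3, 3, 3))
p /= p.sum()

H_xyz = entropy(p.ravel())                 # H(X,Y,Z)
H_xy  = entropy(p.sum(axis=2).ravel())     # H(X,Y): marginalize out Z
H_yz  = entropy(p.sum(axis=0).ravel())     # H(Y,Z): marginalize out X
H_y   = entropy(p.sum(axis=(0, 2)))        # H(Y):   marginalize out X and Z

# strong subadditivity: H(X,Y,Z) + H(Y) <= H(X,Y) + H(Y,Z)
print(H_xyz + H_y <= H_xy + H_yz + 1e-12)  # True
```

Any seed works here, since the inequality holds for every joint distribution; the small tolerance only guards against floating-point round-off.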


Well, I was eventually able to work out an answer with lots of help, so I thought I'd post it.

We start with the relative entropy $H(p(x,y,z)\,||\,p(x|y)\cdot p(y,z))$, which is nonnegative. (Note that $p(x|y)\,p(y,z)$ sums to $1$ over all $x,y,z$, so it is a valid distribution.) Expanding the definition gives $-H(X,Y,Z) - \sum_{xyz}p(x,y,z)\log p(x|y) - \sum_{xyz}p(x,y,z)\log p(y,z)$

$= -H(X,Y,Z) - \sum_{xyz}p(x,y,z)\log p(x|y) + H(Y,Z)$

$= -H(X,Y,Z) + H(X|Y) + H(Y,Z)$, since $\sum_{xyz}p(x,y,z)\log p(x|y) = \sum_{xy}p(x,y)\log p(x|y) = -H(X|Y)$

$= -H(X,Y,Z) + H(X,Y) - H(Y) + H(Y,Z) \geq 0$, using $H(X|Y) = H(X,Y) - H(Y)$ and the nonnegativity of relative entropy.

Thus, $H(X,Y,Z) + H(Y) \leq H(X,Y) + H(Y,Z)$, with equality if and only if $p(x,y,z) = p(x|y)\,p(y,z)$, i.e. exactly when $X\rightarrow Y\rightarrow Z$ forms a Markov chain.
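The equality case can also be checked numerically. In the sketch below (my own illustration, under the assumption of a factorization $p(x,y,z) = p(x,y)\,p(z\mid y)$, which is exactly the Markov-chain condition $X\rightarrow Y\rightarrow Z$) the gap $H(X,Y) + H(Y,Z) - H(X,Y,Z) - H(Y)$ comes out as zero up to round-off:

```python
import numpy as np

rng = np.random.default_rng(1)

def entropy(p):
    # Shannon entropy in bits; terms with p = 0 contribute 0
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# build p(x,y,z) = p(x,y) * p(z|y), so X -> Y -> Z is a Markov chain
p_xy = rng.random((3, 3))
p_xy /= p_xy.sum()
p_z_given_y = rng.random((3, 3))
p_z_given_y /= p_z_given_y.sum(axis=1, keepdims=True)  # rows sum to 1
p = p_xy[:, :, None] * p_z_given_y[None, :, :]

H_xyz = entropy(p.ravel())
H_xy  = entropy(p.sum(axis=2).ravel())
H_yz  = entropy(p.sum(axis=0).ravel())
H_y   = entropy(p.sum(axis=(0, 2)))

# for a Markov chain the inequality is tight: the gap vanishes
gap = (H_xy + H_yz) - (H_xyz + H_y)
print(abs(gap) < 1e-10)  # True
```

This matches the equality condition above: the relative entropy vanishes precisely when $p(x,y,z) = p(x|y)\,p(y,z)$, and the construction here enforces that factorization by design.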