How to prove the following entropy formula?


Could anyone show me a proof, or point me to a source where the following entropy identity is proved? =)

$$H(X,Y|Z)=H(X|Z)+H(Y|X,Z)$$

Thank you!

Accepted answer:

The closely-related identity $H(X,Y) = H(X) + H(Y|X)$ should certainly be in any text on information theory (even if the one you ask for specifically is not).
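As a quick sanity check (not a proof), the unconditional identity $H(X,Y) = H(X) + H(Y|X)$ can be verified numerically on a small made-up joint distribution. The sketch below assumes NumPy; the $2\times 3$ table `p_xy` is an arbitrary example, not anything from the question.

```python
import numpy as np

# Hypothetical 2x3 joint pmf p(x, y); any normalized nonnegative table works.
p_xy = np.array([[0.10, 0.25, 0.15],
                 [0.20, 0.05, 0.25]])

def H(p):
    """Shannon entropy in bits of a pmf given as an array."""
    p = p[p > 0]  # use the convention 0 log 0 = 0
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)   # marginal p(x)
H_xy = H(p_xy)           # joint entropy H(X,Y)
H_x = H(p_x)             # H(X)
# H(Y|X) = sum_x p(x) H(Y | X=x), with p(y|x) = p(x,y)/p(x)
H_y_given_x = sum(p_x[i] * H(p_xy[i] / p_x[i]) for i in range(len(p_x)))

assert abs(H_xy - (H_x + H_y_given_x)) < 1e-9
```

The assertion passes because the identity holds exactly; only floating-point rounding separates the two sides.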

But let's do it anyway, step by step. I'll assume $X$, $Y$, and $Z$ are all discrete random variables. The brute-force way of proving such an identity is to expand the definition of entropy; it's pretty mechanical as long as you recall the chain rule for conditional probabilities, $p(x,y|z) = p(y|x,z)\,p(x|z)$.

First work with the entropy conditioned on a particular value $Z=z$:

$$\begin{align*} H(X,Y|Z=z) &= -\sum_{x,y} p(x,y|z) \log p(x,y|z) \\ &= -\sum_{x,y} p(x,y|z) \log p(y|x,z) - \sum_{x,y} p(x,y|z) \log p(x|z) \\ &= -\sum_{x,y} p(x,y|z) \log p(y|x,z) - \sum_{x} p(x|z) \log p(x|z) \\ &= -\sum_{x} p(x|z) \left[\sum_{y} p(y|x,z) \log p(y|x,z)\right] + H(X|Z=z) \\ &= \sum_{x} p(x|z)\, H(Y|X=x,Z=z) + H(X|Z=z) \end{align*}$$

where the third line uses $\sum_y p(x,y|z) = p(x|z)$. Now average over $z$ with weights $p(z)$:

$$H(X,Y|Z) = \sum_z p(z)\, H(X,Y|Z=z) = \sum_{x,z} p(x,z)\, H(Y|X=x,Z=z) + H(X|Z) = H(Y|X,Z) + H(X|Z)$$

using $p(z)p(x|z) = p(x,z)$ and the definition $H(Y|X,Z) = \sum_{x,z} p(x,z)\, H(Y|X=x,Z=z)$.
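The conditional identity itself can also be checked numerically. A minimal sketch, assuming NumPy and a randomly generated (hypothetical) joint pmf $p(x,y,z)$; each conditional entropy is computed directly from the joint table.

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.random((2, 3, 2))   # hypothetical joint pmf p(x, y, z), axes (x, y, z)
p /= p.sum()                # normalize so the table sums to 1

p_z = p.sum(axis=(0, 1))    # marginal p(z), shape (2,)
p_xz = p.sum(axis=1)        # marginal p(x, z), shape (2, 2)

# H(X,Y|Z) = -sum_{x,y,z} p(x,y,z) log p(x,y|z), with p(x,y|z) = p(x,y,z)/p(z)
H_xy_z = -np.sum(p * np.log2(p / p_z))
# H(X|Z) = -sum_{x,z} p(x,z) log p(x|z), with p(x|z) = p(x,z)/p(z)
H_x_z = -np.sum(p_xz * np.log2(p_xz / p_z))
# H(Y|X,Z) = -sum_{x,y,z} p(x,y,z) log p(y|x,z), with p(y|x,z) = p(x,y,z)/p(x,z)
H_y_xz = -np.sum(p * np.log2(p / p_xz[:, None, :]))

assert abs(H_xy_z - (H_x_z + H_y_xz)) < 1e-9
```

The three sums differ only by floating-point rounding, mirroring the algebraic fact that $\log p(x,y|z) = \log p(y|x,z) + \log p(x|z)$ term by term.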