conditional entropy proof


For random variables $X$ and $Y$, the conditional entropy of $Y$ given $X$ is defined as $$H(Y|X) = - \sum_{x, y} p(x, y) \log p(y|x) = \sum_x p(x) H(Y|X=x)$$ where $$H(Y|X=x) = - \sum_y p(y|x) \log p(y|x)$$

Note: the identity $p(x, y) = p(x) \, p(y|x)$ can be used to prove the equality above.

Similarly, for $H(Y|X, Z)$, we write $$H(Y|X, Z) = \sum_z p(z) H(Y|X, Z=z)$$ where $$H(Y|X, Z=z) = - \sum_{x, y} p(x, y|z) \log p(y|x, z)$$

This is Definition 2.15 in Chapter 2 of the book Information Theory and Network Coding.

I am confused about how to prove the analogous identity for $H(Y|X, Z)$.

Accepted answer:

This is straightforward if you follow the same steps as the first proof:

$$H(Y|X) = - \sum_{x, y} p(x, y) \log p(y|x) = - \sum_{x, y} p(x) p(y|x) \log p(y|x).$$

Performing the sum over $y$, we get:

$$H(Y|X) = \sum_x p(x) H(Y|X=x)$$
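As a sanity check (not part of the proof), the two expressions for $H(Y|X)$ can be compared numerically on a small joint distribution. The distribution below is arbitrary, chosen only for illustration:

```python
import math

# Hypothetical joint distribution p(x, y) over binary X and Y (values arbitrary).
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

# Marginal p(x) = sum_y p(x, y).
p_x = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

def h_y_given_x_eq(x):
    """H(Y | X = x) = -sum_y p(y|x) log p(y|x), with p(y|x) = p(x,y)/p(x)."""
    return -sum((p / p_x[x]) * math.log2(p / p_x[x])
                for (xx, y), p in p_xy.items() if xx == x and p > 0)

# Definition: H(Y|X) = -sum_{x,y} p(x,y) log p(y|x).
lhs = -sum(p * math.log2(p / p_x[x]) for (x, y), p in p_xy.items() if p > 0)

# Decomposition: H(Y|X) = sum_x p(x) H(Y|X=x).
rhs = sum(p_x[x] * h_y_given_x_eq(x) for x in p_x)

assert abs(lhs - rhs) < 1e-12
```

Both sums agree because each term $p(x, y) \log p(y|x)$ is just $p(x)$ times $p(y|x) \log p(y|x)$, exactly as in the derivation above.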

Similarly, for $H(Y|X,Z)$:

$$H(Y|X, Z) = - \sum_{y,x,z} p(y,x,z) \log p(y|x,z) = - \sum_{y,x,z} p(z) p(y,x|z) \log p(y|x,z)$$

Performing the sum over $y$ and $x$, we get:

$$H(Y|X, Z) = \sum_z p(z) H(Y|X, Z=z)$$
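The same numerical sanity check works for the three-variable identity. Again the joint distribution is arbitrary, made up only to confirm that the definition and the decomposition agree:

```python
import math
import itertools

# Hypothetical joint distribution p(x, y, z) over binary X, Y, Z (weights arbitrary, sum to 1).
weights = [0.10, 0.05, 0.15, 0.10, 0.20, 0.05, 0.25, 0.10]
p_xyz = dict(zip(itertools.product([0, 1], repeat=3), weights))

# Marginals p(z) and p(x, z).
p_z, p_xz = {}, {}
for (x, y, z), p in p_xyz.items():
    p_z[z] = p_z.get(z, 0.0) + p
    p_xz[(x, z)] = p_xz.get((x, z), 0.0) + p

# Definition: H(Y|X,Z) = -sum_{x,y,z} p(x,y,z) log p(y|x,z),
# with p(y|x,z) = p(x,y,z)/p(x,z).
lhs = -sum(p * math.log2(p / p_xz[(x, z)])
           for (x, y, z), p in p_xyz.items() if p > 0)

def h_y_given_xz(z):
    """H(Y|X, Z=z) = -sum_{x,y} p(x,y|z) log p(y|x,z), with p(x,y|z) = p(x,y,z)/p(z)."""
    return -sum((p / p_z[z]) * math.log2(p / p_xz[(x, zz)])
                for (x, y, zz), p in p_xyz.items() if zz == z and p > 0)

# Decomposition: H(Y|X,Z) = sum_z p(z) H(Y|X, Z=z).
rhs = sum(p_z[z] * h_y_given_xz(z) for z in p_z)

assert abs(lhs - rhs) < 1e-12
```

Here the factorization $p(x, y, z) = p(z)\, p(x, y|z)$ plays the same role that $p(x, y) = p(x)\, p(y|x)$ played in the two-variable case.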

Hope this clears up your doubts. If not, let's discuss in this thread.