Specific conditional entropy $H(X|Y=y)$ is not bounded by $H(X)$?


Suppose that $P(Y=y)>0$ so that

$$ H(X|Y=y)=-\sum_{x} p(x|y) \log_{2} p(x|y) $$
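For concreteness, this formula translates directly into code (a minimal Python sketch; the function name is my own):

```python
import math

def H_given_y(p_x_given_y):
    """H(X|Y=y) in bits, for the conditional distribution p(x|y) over x."""
    return -sum(p * math.log2(p) for p in p_x_given_y if p > 0)

# A uniform conditional distribution over two outcomes carries one bit:
print(H_given_y([0.5, 0.5]))  # → 1.0
```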

makes sense. I've assumed for a long time that $H(X|Y=y)\le H(X)$, but the Wikipedia article on conditional entropy seems to claim that $H(X|Y=y)$ is not necessarily bounded by $H(X)$.

It is a standard result that $H(X|Y)\le H(X)$, and heuristically it makes perfect sense: knowing additional information ($Y$) can only reduce the uncertainty in $X$.

Then how is it possible that knowing additional information that $Y=y$ can actually increase the uncertainty in $X$?

Is there any nice example where $H(X|Y=y)>H(X)$?

Best answer:

It's important to distinguish $H(X \mid Y)$ from $H(X \mid Y=y)$. The first is a number, the second is a function of $y$. The first is the average of the second, averaged over $p(y)$:

$$H(X \mid Y) = E_y \left[H(X \mid Y=y)\right]$$

Hence, because it's an average, in general we'll have $H(X \mid Y=y)>H(X \mid Y)$ for some values of $y$ and $H(X \mid Y=y)<H(X \mid Y)$ for others. So while it's true that $H(X)\ge H(X \mid Y)$, this does not imply $H(X)\ge H(X \mid Y=y)$ for every $y$.

Example: let $(X,Y)$ take values in $\{(0,0), (0,1), (1,0)\}$ with equal probability ($p=1/3$). Then $H(X)=h(1/3)\approx 0.918<1$, where $h(\cdot)$ is the binary entropy function. But $H(X \mid Y=0)=h(1/2)=1>H(X)$. The average is still smaller: since $H(X \mid Y=1)=0$, we get $H(X \mid Y)=\tfrac{2}{3}\cdot 1+\tfrac{1}{3}\cdot 0=\tfrac{2}{3}<H(X)$.
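A quick numeric check of this example (a Python sketch; the helper `H` is ad hoc, not a library function):

```python
import math

def H(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Joint distribution: (X,Y) uniform on {(0,0), (0,1), (1,0)}
# Marginal of X: P(X=0) = 2/3, P(X=1) = 1/3
print(H([2/3, 1/3]))   # H(X) = h(1/3) ≈ 0.918

# Given Y=0: X is 0 or 1 with probability 1/2 each
print(H([1/2, 1/2]))   # H(X|Y=0) = 1 > H(X)

# Given Y=1: X = 0 with certainty
print(H([1.0]))        # H(X|Y=1) = 0

# Average over p(y): H(X|Y) = (2/3)*1 + (1/3)*0 = 2/3 < H(X)
print(2/3 * H([1/2, 1/2]) + 1/3 * H([1.0]))
```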