Entropy increase when conditioning


I know that conditioning cannot increase entropy, i.e. $H(X \mid Y) \le H(X)$; however, this property seems to be violated in the following example. Can anyone point out where I am wrong?

Assume two discrete random variables $X$ and $Y$, where $X$ is uniform over eight values, so its pmf is $p_X(x) = 1/8$ for each $x$.

The entropy of $X$ must then be (taking $\log = \log_2$) $$H(X) = -\sum_x p_X(x)\log(p_X(x)) = \log(8) = 3.$$ We further assume that $y \in \{1,2\}$ with $p_Y(y) = 1/2$, and that $p_{X|Y}(x \mid y) = 1/4$.
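As a quick numeric sanity check of $H(X)$ (a sketch in Python, assuming $X$ is uniform over eight values as above):

```python
from math import log2

# pmf of X: uniform over eight outcomes (assumed from log(8) = 3 above)
p_x = [1 / 8] * 8

# H(X) = -sum_x p(x) * log2(p(x))
H_X = -sum(p * log2(p) for p in p_x)
print(H_X)  # 3.0
```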

The conditional entropy is then

\begin{align*} H(X|Y) &= -\sum_yp(y) \sum_x p(x|y) \log(p(x|y))\\ &= -\left(2\cdot\left(\frac{1}{2} \cdot \left(8 \cdot \frac{1}{4} \cdot \log\left( \frac{1}{4}\right)\right)\right)\right)\\ &= 4 \end{align*}
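Replicating the sum above numerically (a sketch; as in the derivation, it sums the $1/4$ term over all eight values of $x$ for each of the two values of $y$):

```python
from math import log2

# For each of the two values of y (each with p(y) = 1/2), the inner sum
# uses p(x|y) = 1/4 for all eight values of x -- note that these eight
# "probabilities" sum to 2, not 1.
H_X_given_Y = -sum(
    0.5 * sum(0.25 * log2(0.25) for _ in range(8))
    for _ in range(2)
)
print(H_X_given_Y)  # 4.0
```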

This is clearly wrong, since conditioning cannot increase entropy. Intuitively, I would expect the result to be $2$ rather than $4$.