Relation between entropy of variable and entropy of conditioned variable


Let $X$ be a discrete random variable, and let $E$ be an event on the same probability space as $X$. Let $X_E$ be $X$ conditioned on the event $E$. Is there a general relationship between the Shannon entropy of $X$ and $X_E$?

I originally thought that one could say that $H(X_E) \geq H(X) - \log(1/\Pr(E))$, but this is not true: consider a $p$-biased coin $X$, and let $E$ denote the event that $X$ lands "heads" (which happens with probability $p$). Then $H(X_E) = 0$, but $H(X) - \log(1/\Pr(E)) = p\log(1/p) + (1 - p)\log(1/(1-p)) - \log(1/p) = (1 - p)\log\frac{p}{1-p}$, which is strictly positive when $p > 1/2$.
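The counterexample is easy to check numerically. Here is a short sketch in Python (the choice $p = 0.75$ is an arbitrary assumption; any $p > 1/2$ works):

```python
import math

def H(probs):
    """Shannon entropy (in bits) of a discrete distribution."""
    return sum(-q * math.log2(q) for q in probs if q > 0)

p = 0.75  # any bias p > 1/2 exhibits the failure

H_X = H([p, 1 - p])            # entropy of the p-biased coin
H_XE = H([1.0])                # X conditioned on "heads" is deterministic
bound = H_X - math.log2(1 / p) # proposed lower bound H(X) - log(1/Pr(E))

print(H_XE, bound)  # H(X_E) = 0, yet the proposed bound is positive
assert H_XE == 0.0
assert bound > 0    # so H(X_E) >= H(X) - log(1/Pr(E)) fails here
```

Running this confirms that the conjectured bound exceeds $H(X_E)$ for this event.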

Is there nothing we can say about the two quantities?