Proof for $H < 1$ (bits) $\implies$ $p_i > 0.5$ for some $i$


I am trying to prove that if the entropy ($H$) of a random variable with $n$ possible outcomes is less than one bit, one of the outcomes must have probability larger than 0.5. Intuitively I think that this is the case, and I manually checked for several possible distributions of the random variable. For $n=2$, this can be easily seen in the graph of the entropy. However, I couldn't prove it analytically for an arbitrary $n$ and could not find any proof online.
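The manual checks mentioned above can be automated. Below is a minimal sketch (plain-Python, with a hypothetical `entropy_bits` helper) that samples random distributions for several values of $n$ and verifies the claimed implication: whenever $H < 1$, some outcome has probability above 0.5.

```python
import math
import random

def entropy_bits(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

random.seed(0)
for _ in range(10_000):
    n = random.randint(2, 6)
    # Sample a random point on the probability simplex by normalizing.
    w = [random.random() for _ in range(n)]
    s = sum(w)
    p = [x / s for x in w]
    # Check the conjecture: H < 1 bit implies max probability > 0.5.
    if entropy_bits(p) < 1:
        assert max(p) > 0.5
```

This is only a spot check over random samples, of course, not a proof.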

BEST ANSWER

We'll prove the contrapositive: if $p_i \le 0.5$ for all $i$, then $H(p_1, \ldots, p_n) \ge 1$.

Say that you have $p_1, \ldots, p_n$ with $p_1 + \cdots + p_n = 1$ and $p_i \le 0.5$ for all $i$. Then you have

$$ H(p_1, \ldots, p_n) = \sum_i -p_i \log_2 p_i $$

and since the function $f(x) = -\log_2 x$ is decreasing, $p_i \le 0.5$ gives $-\log_2 p_i \ge -\log_2 0.5 = 1$. So you get

$$ H(p_1, \ldots, p_n) = \sum_i -p_i \log_2 p_i \ge \sum_i -p_i \log_2 0.5 = -\log_2 0.5 \sum_i p_i = \log_2 2 = 1.$$

Informally, the entropy $H(p_1, \ldots, p_n)$ is a weighted average of the information content of each event $i$, which is $-\log_2 p_i$. If $p_i \le 0.5$ for all $i$, then the information content of every event is at least one bit, so the entropy is at least one bit.
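The bound in the derivation above can be confirmed numerically. The sketch below (using a hypothetical `entropy_bits` helper) evaluates the entropy of an example distribution where every $p_i \le 0.5$; the chain of inequalities predicts $H \ge 1$.

```python
import math

def entropy_bits(p):
    """Shannon entropy in bits of a probability vector p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# An example distribution with every p_i <= 0.5.
p = [0.5, 0.25, 0.125, 0.125]
assert abs(sum(p) - 1.0) < 1e-12

h = entropy_bits(p)
# Each term -log2(p_i) is at least 1, so the weighted average is too.
assert h >= 1.0
print(h)  # 1.75
```

Here $H = 0.5\cdot 1 + 0.25\cdot 2 + 0.125\cdot 3 + 0.125\cdot 3 = 1.75$ bits, comfortably above the bound.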