Looking for the name of a theorem relating Shannon entropy to largest probability


Based on the definition of the Shannon entropy, $$ H = \sum_x p(x) \log \frac{1}{p(x)} $$ for a probability distribution $p$ defined on a discrete set of outcomes $x$, one can bound the probability of any outcome: $$ \forall x,\ p(x) \le e^{-H}, $$

using the fact that $\log \frac{1}{p(x)} \ge \log \frac{1}{\max_{x'} p(x')}$ for any $x$. Is there a name for this inequality?

1 Answer

The claimed inequality is false. What we actually have, writing $p_M$ for the maximum of the $p_x$ and using nats, is

$$ H = \sum_x p_x \log (1/p_x) \ge \sum_x p_x \log (1/p_M) = \log(1/p_M) $$

Hence $-H \le \log p_M$, i.e. $p_M \ge e^{-H}$. So you got the inequality backwards, and it doesn't let you conclude anything about the other $p_x$ values.
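The corrected direction is easy to check numerically. Here is a minimal sketch in Python (the distribution is a hypothetical example chosen for illustration):

```python
import math

def entropy_nats(p):
    """Shannon entropy in nats of a discrete distribution p."""
    return sum(pi * math.log(1 / pi) for pi in p if pi > 0)

# Hypothetical example distribution
p = [0.5, 0.25, 0.125, 0.125]

H = entropy_nats(p)
p_max = max(p)

# The answer's inequality: the LARGEST probability is bounded
# below by e^{-H}, not above.
print(f"p_max = {p_max:.4f}, e^(-H) = {math.exp(-H):.4f}")
assert p_max >= math.exp(-H)

# Note that smaller probabilities can fall below e^{-H},
# so the bound says nothing about the other p_x values.
assert min(p) < math.exp(-H)
```

Equality $p_M = e^{-H}$ holds exactly when the distribution is uniform, since then every term $\log(1/p_x)$ equals $\log(1/p_M)$.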