Consider a set of probabilities $\{p_1, ..., p_N\}$, sorted so that $p_1\leq p_2\leq ... \leq p_N$, and let $D = p_N - p_1$ denote the difference between the largest and smallest probabilities. I want to know whether the Shannon entropy of the distribution, $H = -\sum_i p_i \log p_i$, can be used to bound $D$.
Some properties of $D$:
- $0\leq D\leq1$
- $D$ is minimized (i.e. $D=0$) exactly for the uniform distribution $p_i = \frac{1}{N}$, which is the maximum entropy distribution.
- $D$ is maximized (i.e. $D=1$) for a distribution with $p_N=1$ and $p_i=0$ for $i\neq N$, which is a minimum entropy distribution.
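As a quick sanity check of the two extreme cases above, here is a small sketch (the helper names `shannon_entropy` and `spread` are my own):

```python
import math

def shannon_entropy(p):
    """H = -sum_i p_i log p_i, using the convention 0 log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def spread(p):
    """D = p_N - p_1, the gap between the largest and smallest probability."""
    return max(p) - min(p)

N = 4
uniform = [1.0 / N] * N               # maximum entropy: H = log N, D = 0
degenerate = [0.0] * (N - 1) + [1.0]  # minimum entropy: H = 0, D = 1

print(spread(uniform), shannon_entropy(uniform))
print(spread(degenerate), shannon_entropy(degenerate))
```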
Clearly, $D$ is not simply bounded by $H$ itself, since $0\leq D\leq 1$ while $0\leq H\leq \log N$; any bound would have to be some function of $H$ (and possibly $N$). But the two quantities do seem to be inversely related.
My question is: can we say anything about intermediate entropy distributions? Given the entropy of a distribution, can one find an upper (or lower) bound for $D$?
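To illustrate what such a bound might look like, here is a numerical experiment (not a proof): sample many random distributions, and record the range of $D$ observed within each entropy slice. The sampling scheme (a flat Dirichlet via normalized exponentials) and the choice of 10 slices are arbitrary.

```python
import math
import random

def shannon_entropy(p):
    """H = -sum_i p_i log p_i, with 0 log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

N = 4
random.seed(0)
samples = []
for _ in range(20000):
    # A flat Dirichlet sample: normalized i.i.d. exponentials.
    x = [random.expovariate(1.0) for _ in range(N)]
    s = sum(x)
    p = sorted(xi / s for xi in x)
    samples.append((shannon_entropy(p), p[-1] - p[0]))

# Bin entropy into 10 slices of [0, log N] and report the observed D range.
Hmax = math.log(N)
bins = [[] for _ in range(10)]
for H, D in samples:
    bins[min(int(10 * H / Hmax), 9)].append(D)
for i, b in enumerate(bins):
    if b:
        print(f"H in [{i * Hmax / 10:.2f}, {(i + 1) * Hmax / 10:.2f}): "
              f"D in [{min(b):.3f}, {max(b):.3f}]")
```

Empirically, the maximum $D$ seen in each slice shrinks as $H$ grows, which is consistent with an $H$-dependent upper bound on $D$, though the experiment cannot say what the tight bound is.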