How are the variance of probabilities and the Shannon entropy related?


Is there a way to derive the expression for the Shannon entropy from the variance (or mean squared dispersion) of the probabilities, $$ \sigma_P^2=\frac{1}{N}\sum_{i=1}^N\left(p_i-\langle p\rangle\right)^2=\frac{1}{N}\sum_{i=1}^Np_i^2-\frac{1}{N^2},$$ where $p_i$ is the probability of the $i$th of $N$ possible outcomes (so that $\langle p\rangle=1/N$), or vice versa?

In particular, I am interested in a relation from which it becomes evident that the entropy is maximal for the uniform distribution.
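For concreteness, here is a small numerical sketch (the function names are mine, not from any reference) comparing the two quantities: the variance above vanishes exactly at the uniform distribution, where the entropy attains its maximum $\log N$.

```python
import math

def variance_of_probs(p):
    """Mean squared dispersion of the p_i around their mean 1/N."""
    n = len(p)
    mean = 1.0 / n  # since sum(p) == 1, the mean probability is 1/N
    return sum((pi - mean) ** 2 for pi in p) / n

def shannon_entropy(p):
    """Shannon entropy in nats; terms with p_i == 0 contribute 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]

# Uniform distribution: variance is exactly 0, entropy equals log N
print(variance_of_probs(uniform), shannon_entropy(uniform))

# Skewed distribution: variance > 0, entropy strictly below log N
print(variance_of_probs(skewed), shannon_entropy(skewed))
```

So the two measures move in opposite directions near the uniform distribution, which is the kind of relation I am hoping can be made precise.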