Is the information entropy of a discrete random variable taking $n$ values maximized by a fair $n$-sided die?


In other words, is the following true?

$$p_i > 0 \land \sum\limits_{i=1}^n p_i = 1 \implies \sum\limits_{i=1}^n p_i\log\frac{1}{p_i} \le \log{n}$$

I don't know how to tackle this. I thought of Jensen's inequality or Cauchy–Schwarz, but the obvious applications work in the wrong direction: they give a lower bound on the LHS in terms of the $p_i$.


On BEST ANSWER

Note that $\log$ is concave, so applying Jensen's inequality to the random variable $1/p_i$ with weights $p_i$ gives $$ \sum_{i}p_i\log\frac{1}{p_i}\leq\log\Bigl(\sum_{i}p_i\cdot\frac{1}{p_i}\Bigr)=\log\Bigl(\sum_{i}1\Bigr)=\log n, $$ with equality exactly when $1/p_i$ is constant, i.e. $p_i = 1/n$ for all $i$.
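As a quick sanity check of the bound, one can compute the entropy (in nats) of many random distributions and confirm none exceeds $\log n$, while the uniform distribution attains it. This is a minimal sketch; `entropy` and `random_distribution` are hypothetical helper names, not from the answer above.

```python
import math
import random

def entropy(p):
    """Shannon entropy (natural log) of a probability vector p."""
    return sum(pi * math.log(1 / pi) for pi in p if pi > 0)

def random_distribution(n):
    """A random probability vector of length n (normalize random weights)."""
    w = [random.random() for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

n = 6  # a six-sided die

# The fair die attains the maximum entropy log(n).
uniform = [1 / n] * n
assert abs(entropy(uniform) - math.log(n)) < 1e-12

# No random distribution on n outcomes exceeds log(n).
random.seed(0)
for _ in range(1000):
    p = random_distribution(n)
    assert entropy(p) <= math.log(n) + 1e-12
```

The small tolerance (`1e-12`) guards against floating-point rounding in the comparison.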