Prove that entropy is maximized when probability is $1/n$


How can one prove that the entropy of a die roll is maximized when each of its $6$ faces has equal probability $1/6$?


BEST ANSWER

Surprise is defined as $-\log p(X=x)$. A good way to think of entropy is as the "expected surprise": $H(X) = \mathbb{E}[-\log p(X)] = -\sum_i p_i \log p_i$. In this sense the uniform distribution maximizes the expected surprise: any imbalance shifts probability mass toward high-probability, low-surprise outcomes. The other answer makes this precise with Jensen's inequality.


The entropy is given by $H = -\sum_i p_i\ln p_i = \sum_i p_i\ln\frac{1}{p_i}$. Since $\ln$ is concave, Jensen's inequality gives $\sum_i p_i\ln\frac{1}{p_i} \le \ln\left(\sum_i p_i\cdot\frac{1}{p_i}\right) = \ln n$, with equality if and only if $\frac{1}{p_i}$ is constant, i.e. $p_i = \frac{1}{n}$ for all $i$. For a die, $n = 6$ and the maximum entropy is $\ln 6$.
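As a quick numerical sanity check (a plain-Python sketch, not part of either answer): compute the entropy of the uniform distribution and of many random distributions over $6$ outcomes, and confirm none exceeds $\ln 6$.

```python
import math
import random

def entropy(p):
    """Shannon entropy -sum p_i ln p_i (natural log), skipping zero terms."""
    return -sum(x * math.log(x) for x in p if x > 0)

n = 6
max_h = math.log(n)  # the claimed maximum, ln 6

# The uniform distribution attains ln 6.
assert abs(entropy([1 / n] * n) - max_h) < 1e-12

# Random distributions over 6 outcomes never exceed ln 6.
random.seed(0)
for _ in range(10_000):
    w = [random.random() for _ in range(n)]
    total = sum(w)
    p = [x / total for x in w]
    assert entropy(p) <= max_h + 1e-12
```

Every randomly drawn distribution stays at or below $\ln 6 \approx 1.7918$, consistent with the Jensen bound above.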