Proving that $p_i = \frac{1}{n}$ maximises entropy


Let $n$ be the cardinality of $X$. Using Jensen's inequality, we can bound the entropy from above as follows: $$ \begin{align} H(X) &= -\sum_{i=1}^n p_i \log(p_i) \\ &= \sum_{i=1}^n p_i \log\left(\frac{1}{p_i}\right) \\ &= \mathbb{E}\left[\log\frac{1}{p_X}\right] \\ &\leq \log\mathbb{E}\left[\frac{1}{p_X}\right] \\ &= \log\sum_{i=1}^n p_i \cdot \frac{1}{p_i} \\ &= \log(n) \end{align} $$ (Only the Jensen step is an inequality; the remaining steps are equalities.)
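As a quick sanity check on the bound (not a proof), one can compute the entropy of random distributions and confirm it never exceeds $\log n$. The helper `entropy` below is a hypothetical name, using natural logarithms:

```python
import math
import random

def entropy(p):
    """Shannon entropy H = -sum p_i * log(p_i), natural log, with 0*log(0) = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

n = 5
# Draw a random probability vector by normalising positive weights.
weights = [random.random() + 1e-9 for _ in range(n)]
total = sum(weights)
p = [w / total for w in weights]

# The entropy of any distribution on n outcomes is bounded by log(n).
assert entropy(p) <= math.log(n) + 1e-12
```

The small tolerance only guards against floating-point rounding; the mathematical inequality is strict unless the distribution is uniform.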

How do we prove that $p_i = \frac{1}{n}$ achieves this maximum?

More generally, how does one go about proving such results when series are involved?

Answer

In this case, why not just compute the sum and show that it hits the upper bound? Substituting $p_i = \frac{1}{n}$: $$ \sum_{i=1}^n p_i\log\frac{1}{p_i} = \sum_{i=1}^n \frac{1}{n}\log n = \frac{n\cdot \log n}{n} = \log n $$ I don't think there is a general answer to your second question. Sometimes you compute the sum directly, sometimes you check boundary conditions, etc. It depends on the specific question, the preconditions, and the expressions involved.
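The substitution above can be checked numerically: the entropy of the uniform distribution on $n$ outcomes equals $\log n$ exactly (up to floating-point rounding). A minimal sketch, using natural logarithms:

```python
import math

n = 7
uniform = [1 / n] * n

# H(uniform) = -sum (1/n) * log(1/n) = log(n)
H = -sum(p * math.log(p) for p in uniform)

assert abs(H - math.log(n)) < 1e-12
```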