I have a random variable $X$ and want to show that having an entropy $$ H(X) = - \sum_{i=1}^n p_i \log(p_i) = \log(n)$$ is equivalent to the distribution of $X$ being uniform.
Deducing the entropy from the uniform distribution is straightforward, but I don't see how to prove the converse.
Consider (with $\log$ denoting the natural logarithm) $$e^{H(X)} = \prod_i \left(\frac{1}{p_i}\right)^{p_i}.$$
By the weighted AM-GM inequality, applied to the values $\frac{1}{p_i}$ with weights $p_i$,
$$\prod_i \left(\frac{1}{p_i}\right)^{p_i} \leq \sum_i p_i \cdot \frac{1}{p_i} = n.$$
Equality holds iff all the values are equal, i.e. $\frac{1}{p_i} = \frac{1}{p_j}$ for all $i,j$, or equivalently $p_i = p_j$.
Since the $p_i$ sum to $1$, this forces $$p_i = \frac{1}{n},$$ so $X$ is uniform.
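As a quick numerical sanity check of the claim (a sketch using natural logarithms, matching the $e^{H(X)}$ trick above; the distributions and the helper name `entropy` are just for illustration):

```python
import math

def entropy(p):
    """Shannon entropy with natural log: H(p) = -sum p_i log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 4
uniform = [1 / n] * n
skewed = [0.7, 0.1, 0.1, 0.1]

# The uniform distribution attains the maximum log(n) ...
print(math.isclose(entropy(uniform), math.log(n)))  # True
# ... and any non-uniform distribution falls strictly below it.
print(entropy(skewed) < math.log(n))                # True
```

This only illustrates the inequality for one example; the AM-GM argument above is what establishes it, together with the equality case, in general.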