How can one prove that the entropy of a random variable $X$ is upper bounded by $\log|X|$?
I tried the following: $$H(X) = - \sum_x p(x)\log p(x)$$ $$ \leq - \sum_x p(x)\sum_x\log p(x)$$ $$= - \sum_x\log p(x)$$
But that doesn't get me to the proof. Does anyone know how to continue?
Show that $H(X)$ is maximized when $p(x)=1/|X|$, i.e. when the distribution is uniform. You can use Lagrange multipliers, for example, with the constraint $\sum_x p(x)=1$. Then
$$H(X)\leq -\sum_x \frac{1}{|X|}\log\frac{1}{|X|}=\log|X|.$$
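To spell out the Lagrange-multiplier step: with multiplier $\lambda$ for the normalization constraint, the Lagrangian is
$$\mathcal{L} = -\sum_x p(x)\log p(x) + \lambda\Big(\sum_x p(x) - 1\Big),$$
and setting the derivative with respect to each $p(x)$ to zero gives
$$\frac{\partial \mathcal{L}}{\partial p(x)} = -\log p(x) - 1 + \lambda = 0 \implies p(x) = e^{\lambda - 1},$$
which is the same constant for every $x$; the constraint $\sum_x p(x)=1$ then forces $p(x)=1/|X|$.

Alternatively, one can avoid calculus entirely and apply Jensen's inequality to the concave function $\log$:
$$H(X) = \sum_x p(x)\log\frac{1}{p(x)} \leq \log\sum_x p(x)\cdot\frac{1}{p(x)} = \log|X|.$$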