For a discrete random variable $x$ taking $n$ values with probabilities $p_1,\dots,p_n$, the entropy is $$H(x) = -(p_1\log p_1+\cdots+p_n\log p_n),$$ so what is the maximum of $H(x)$?
Here is what I tried. I computed the gradient $$\nabla_pH = -\begin{pmatrix}\log p_1+1\\ \vdots\\ \log p_n+1\end{pmatrix},$$ and the Hessian $\nabla_p^2H$ can also be computed: it is the diagonal matrix with entries $-1/p_i$, which is negative definite, right? So $H(x)$ is concave and any critical point is a global maximum.
By setting $\nabla_pH = 0$, I find that each component gives $\log p_i + 1 = 0$, i.e. $p_i = e^{-1}$. However, since $\sum p_i = 1$, this can't be the answer, so how can I find where the maximum is achieved under this constraint?
UPDATE
As @Semiclassical pointed out, I tried Lagrange multipliers. Here it is: $$L(p_1,\cdots,p_n, \lambda) = H(x) + \lambda\Big(\sum p_i - 1\Big),$$ then set the partial derivative of $L$ with respect to each $p_i$ to 0: $$\frac{\partial L}{\partial p_i} = -\log p_i-1+\lambda = 0.$$ Thanks to @Semiclassical, I indeed made a mistake (not just a typo) with the partial derivative over $p_i$; after fixing it, I can now solve the equation: $$p_i = e^{\lambda-1},$$ so all the variables $p_1,\cdots,p_n$ must have the same value to achieve the maximum of $H(x)$, and with $\sum p_i = 1$ that value is $p_i = 1/n$, right?
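This conclusion can be sanity-checked numerically. The helper `entropy` and the random sampling scheme below are my own illustration (not from the question): no random distribution on the simplex should beat the uniform one.

```python
import math
import random

def entropy(p):
    """Shannon entropy (natural log) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 5
uniform = [1.0 / n] * n
h_max = entropy(uniform)  # equals log(n) up to floating-point error

# Compare against randomly sampled distributions on the simplex
random.seed(0)
for _ in range(10_000):
    w = [random.random() for _ in range(n)]
    s = sum(w)
    p = [wi / s for wi in w]  # normalize so the entries sum to 1
    assert entropy(p) <= h_max + 1e-12
```

Every sampled distribution has entropy at most $\ln n$, consistent with the Lagrange-multiplier result.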
$f(x)=x\ln x$ is convex. By Jensen's inequality, $$\sum_{k=1}^np_k\ln p_k=n\sum_{k=1}^n\frac{1}{n}f(p_k)\geq nf\Big(\frac{1}{n}\sum_{k=1}^np_k\Big) = nf\Big(\frac{1}{n}\Big) =-\ln n,$$ so $H(x)=-\sum_{k=1}^n p_k\ln p_k\leq\ln n$, with equality when every $p_k=\frac1n$.
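The Jensen step itself can be checked numerically. The helper `f` and the sampling below are my own sketch, not part of the answer: for any distribution $p$, the average of $f(p_k)$ should never fall below $f$ evaluated at the average, which is $f(1/n)$.

```python
import math
import random

def f(x):
    """The convex function f(x) = x ln x from the answer (with f(0) taken as 0)."""
    return x * math.log(x) if x > 0 else 0.0

n = 4
random.seed(1)
for _ in range(10_000):
    w = [random.random() for _ in range(n)]
    s = sum(w)
    p = [wi / s for wi in w]  # a random point on the probability simplex
    mean_f = sum(f(pk) for pk in p) / n
    # Jensen for convex f: mean of f >= f of the mean; the mean of the p_k is 1/n
    assert mean_f >= f(1.0 / n) - 1e-12
```

Multiplying the inequality `mean_f >= f(1/n)` by $n$ recovers $\sum_k p_k\ln p_k \geq -\ln n$, i.e. $H(x)\leq\ln n$.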