Consider the following function: $f(x_1, \dots, x_n) = p_1 \log x_1 + \dots + p_n\log x_n$ subject to the constraint that $\sum_i p_i = \sum_i x_i = 1$. It is also known that $p_i \in [0,1]$ and $x_i \in [0,1]$.
I need to prove formally that the function attains its global maximum when $\frac{p_1}{x_1} = \dots = \frac{p_n}{x_n}$, i.e., that $x_i = p_i$ gives the global maximum.
I can prove this formally for two variables by using the standard first and second derivative test (eliminate one of the variables). I can also verify empirically that $x_i = p_i$ is the global maximum of this function when $n > 2$ but want to prove this formally.
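For reference, the empirical check described above can be sketched as follows: sample feasible points $x$ (positive entries summing to $1$) and confirm that none beats $f(p)$. This is a minimal sketch; the choice of a Dirichlet sampler and the specific $n$ are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
p = rng.dirichlet(np.ones(n))  # a fixed probability vector p_1, ..., p_n

def f(x, p):
    # objective: sum_i p_i * log(x_i)
    return np.sum(p * np.log(x))

best = f(p, p)  # candidate maximum at x = p
# sample many feasible x (positive, summing to 1) and compare
for _ in range(10_000):
    x = rng.dirichlet(np.ones(n))
    assert f(x, p) <= best + 1e-12
print("no sampled point exceeds f(p) =", best)
```

Of course this only gives evidence, not a proof, which is exactly the gap the question asks about.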
Is there a suitable technique for multivariate functions that I can use?
Using the method of Lagrange multipliers with the constraint $g(x)=\sum_i x_i=1$, we obtain $$\nabla f=\left(\dfrac{p_1}{x_1},\cdots ,\dfrac{p_n}{x_n}\right)=\lambda(1,1,\cdots,1),$$which means that $$\dfrac{p_1}{x_1}=\cdots =\dfrac{p_n}{x_n}=\lambda.$$Writing this as $p_i=\lambda x_i$ and summing over $i$ gives $1=\sum_i p_i=\lambda\sum_i x_i=\lambda$, so $\lambda=1$ and the critical point is $x_i=p_i$. To complete the proof we need to show that this critical point is a global maximum. The Hessian of $f$ is $$H=\begin{bmatrix}-\dfrac{p_1}{x_1^2}&0&\cdots&0\\0&-\dfrac{p_2}{x_2^2}&\cdots&0\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&-\dfrac{p_n}{x_n^2}\end{bmatrix},$$which is negative definite at every feasible $x$ when all $p_i>0$ (negative semi-definite if some $p_i=0$). Hence $f$ is concave on the constraint set, which is convex, and a critical point of a concave function on a convex set is a global maximum.
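The stationarity condition and the sign of the Hessian can be checked numerically at $x=p$. A minimal sketch, assuming some concrete values for $p$ (any $p_i>0$ summing to $1$ would do):

```python
import numpy as np

p = np.array([0.1, 0.2, 0.3, 0.4])  # hypothetical p with sum 1
x = p.copy()                        # the claimed maximizer x = p

grad = p / x                        # ∇f = (p_1/x_1, ..., p_n/x_n)
lam = grad[0]
assert np.allclose(grad, lam)       # all ratios equal: Lagrange condition
assert np.isclose(lam, 1.0)         # summing p_i = λ x_i over i forces λ = 1

H = np.diag(-p / x**2)              # diagonal Hessian of f
eigvals = np.linalg.eigvalsh(H)
assert np.all(eigvals < 0)          # negative definite since every p_i > 0
print("lambda =", lam, "Hessian eigenvalues:", eigvals)
```

At $x=p$ the diagonal entries reduce to $-1/p_i$, so all eigenvalues are strictly negative whenever every $p_i>0$.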