Calculus-based proof that $ x_1^{p_1}\cdots x_n^{p_n}\le p_1x_1+\dots+p_nx_n$ when $\sum p_i=1$


Let

$$g(x_1,\dots,x_n)=x_1^{p_1}\cdots x_n^{p_n},\qquad u(x_1,\dots,x_n)=p_1x_1+\dots+p_nx_n,$$

where $\sum p_i = 1$.

I have to show that $f(x)=g(x)-u(x)$ is always nonpositive on $\Bbb R_+^n$. I've already shown that $f$ has critical points all along the diagonal $\{t(1,\dots,1):t>0\}$, that it vanishes at each of these critical points, and that its Hessian at each of these points is negative definite, so I know they're all local maxima. But to conclude what I need, I have to show that they're also global maxima.

How can I do this? I know there are no other maxima (because there are no other critical points), but the function might tend to something positive at infinity, for example. Should I study the behavior of $f(x)$ as $|x|\to\infty$? How would I approach that for a multivariable function like this?
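As a quick numerical sanity check (not a proof), one can sample random weights and random positive points and confirm $f\le 0$; the helper `f` below is just an ad-hoc implementation of $g-u$, not code from any particular library:

```python
import random

def f(x, p):
    # f = g - u: weighted geometric mean minus weighted arithmetic mean
    g = 1.0
    u = 0.0
    for xi, pi in zip(x, p):
        g *= xi ** pi
        u += pi * xi
    return g - u

random.seed(0)
for _ in range(10_000):
    n = random.randint(2, 6)
    w = [random.random() for _ in range(n)]
    p = [wi / sum(w) for wi in w]               # weights summing to 1
    x = [random.uniform(1e-3, 1e3) for _ in range(n)]
    # allow a tiny floating-point tolerance near the diagonal, where f = 0
    assert f(x, p) <= 1e-9 * max(x)
```

Of course this only probes finitely many points; it cannot settle the behavior as $|x|\to\infty$.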

1 Answer


The function $f$ is homogeneous of degree $1$ (this uses $\sum p_i=1$): $f(tx_1,\dots,tx_n)=tf(x_1,\dots,x_n)$ for $t>0$. Thus it suffices to show $f\le 0$ on the intersection of $\mathbb R^n_+$ with the unit sphere. This turns the problem into a constrained maximization: since that intersection is compact, the maximum is attained somewhere. On the boundary of this spherical piece we have $g=0$ (because one of the coordinates vanishes), hence $f<0$ there. To rule out critical points other than the one you already know, you can use Lagrange multipliers, which leads to the condition $\nabla f(x)=\lambda x$.
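For concreteness, here is the gradient computation behind that Lagrange condition (on the open set where all $x_i>0$, with constraint $|x|^2=1$):

$$\frac{\partial f}{\partial x_i}=\frac{p_i}{x_i}\,x_1^{p_1}\cdots x_n^{p_n}-p_i=p_i\left(\frac{g(x)}{x_i}-1\right),$$

so $\nabla f(x)=\lambda x$ reads $p_i\big(g(x)/x_i-1\big)=\lambda x_i$ for each $i$, which one then checks forces $x_1=\dots=x_n$.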

That said... I think the above is doing it the hard way. I would rather apply Jensen's inequality to the concave function $t\mapsto \log t$.
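To spell out that Jensen step: since $\log$ is concave, Jensen's inequality with weights $p_i$ gives

$$\log\Big(\sum_{i=1}^n p_i x_i\Big)\ \ge\ \sum_{i=1}^n p_i\log x_i=\log\big(x_1^{p_1}\cdots x_n^{p_n}\big),$$

and exponentiating both sides (the exponential is increasing) yields $x_1^{p_1}\cdots x_n^{p_n}\le p_1x_1+\dots+p_nx_n$ directly.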