Finding minimum of $x_1^p+\cdots+x_n^p$ subject to $x_1+\cdots+x_n=1$

I want to find the minimum of $f=x_1^p+\cdots+x_n^p$ ($p>1$) subject to $g=x_1+\cdots+x_n=1$.

By the method of Lagrange multipliers, if $f$ has a local extremum at $P$ on the constraint set, then $\nabla f(P)=\lambda \nabla g(P)$. I solved this and got $x_1=\cdots =x_n=1/n$, so $P=(1/n,\cdots,1/n)$ is a candidate for a minimum. But I don't know how to prove that $f$ actually attains its minimum at $P$. How can I show it?
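For completeness, the Lagrange computation referred to above runs as follows. Componentwise, $\nabla f=\lambda\nabla g$ reads

$$p\,x_i^{p-1} = \lambda, \qquad i = 1,\dots,n,$$

so every $x_i$ equals the common value $(\lambda/p)^{1/(p-1)}$, and the constraint $x_1+\cdots+x_n=1$ then forces $x_i=1/n$ for all $i$.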

There are 4 best solutions below


A simple method is to use the power mean inequality:

$$\sqrt[p]{\frac{x_1^p + \dots + x_n^p}{n}} \ge \frac{x_1 + \dots + x_n}{n} = \frac{1}{n},$$

with equality if and only if $x_1 = \dots = x_n$. Raising both sides to the $p$-th power and multiplying by $n$ gives $x_1^p + \dots + x_n^p \ge n \cdot n^{-p} = n^{1-p}$, so the minimum is $n^{1-p}$, attained at $x_i = 1/n$.
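As a quick numerical sanity check (not part of the original answer; the solver choice and the helper name `min_power_sum` are my own), scipy's constrained optimizer should recover the value $n^{1-p}$:

```python
import numpy as np
from scipy.optimize import minimize

def min_power_sum(n, p):
    """Numerically minimize x_1^p + ... + x_n^p subject to x_1 + ... + x_n = 1."""
    x0 = np.arange(1, n + 1, dtype=float)
    x0 /= x0.sum()  # a feasible but non-uniform starting point
    res = minimize(
        lambda x: np.sum(x ** p),
        x0,
        bounds=[(0.0, 1.0)] * n,  # keep x_i >= 0 so x**p is real-valued
        constraints=[{"type": "eq", "fun": lambda x: np.sum(x) - 1.0}],
    )
    return res.fun

n, p = 5, 3
print(min_power_sum(n, p))  # should be close to n**(1 - p) = 0.04
```

The optimizer converges to the uniform point $x_i = 1/n$ from any feasible start, as the convexity arguments below predict.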


This follows from the strict convexity of $x \mapsto x^{p}$ for $p>1$: the objective $f$ is then strictly convex, so on the convex constraint set $\{g=1\}$ it has at most one critical point, which is necessarily the global minimum.
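Spelled out (this expansion is mine, not the original answerer's): a differentiable convex $f$ satisfies $f(y) \ge f(P) + \nabla f(P)\cdot(y-P)$, and at the critical point $\nabla f(P)=\lambda\nabla g(P)$, so for any feasible $y$

$$f(y) \;\ge\; f(P) + \lambda \sum_{i=1}^{n}\left(y_i - \frac{1}{n}\right) \;=\; f(P),$$

since $\sum_{i} y_i = 1$. Hence $P$ is the global minimizer.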


'Close' to the candidate point $P=(1/n,\dots,1/n)$, a second-order Taylor expansion gives:

\begin{align} \sum_{i = 1}^{n}x_{i}^{p} &\sim n^{1 - p}\ +\ \overbrace{\sum_{i = 1}^{n}p\left(\frac{1}{n}\right)^{p - 1}\left(x_{i} - \frac{1}{n}\right)}^{=\ 0}\ +\ \sum_{i = 1}^{n}\frac{1}{2}\,p\left(p - 1\right)\left(\frac{1}{n}\right)^{p - 2}\left(x_{i} - \frac{1}{n}\right)^{2} \\[5mm]&= n^{1 - p} + \frac{1}{2}\,p\left(p - 1\right)n^{2 - p}\sum_{i = 1}^{n}\left(x_{i} - \frac{1}{n}\right)^{2}\ >\ n^{1 - p} \quad\mbox{when}\quad p < 0 \quad\mbox{or}\quad p > 1 \end{align}

The first-order term vanishes because the constraint gives $\sum_{i}\left(x_{i} - \frac{1}{n}\right) = \sum_{i}x_{i} - 1 = 0$.

So $f$ indeed has a minimum at $P$.


As $f(x) = x^p$ is convex for $p > 1$, we can use Jensen's inequality:

$$ \frac{x_1^p+x_2^p+\cdots+x_n^p}{n} \geq \left(\frac{x_1+x_2+\cdots+x_n}{n}\right)^p$$

Since the right-hand side equals $(1/n)^p$ on the constraint set, multiplying through by $n$ gives $x_1^p+\cdots+x_n^p \ge n^{1-p}$; hence the critical point obtained via Lagrange multipliers is in fact the global minimum.