Help with Lagrange multipliers to prove the Arithmetic-Geometric Mean inequality


I have to prove the inequality of arithmetic and geometric means using Lagrange multipliers. The book I am using (Wendell Fleming) gives the following instructions: let $$f:\mathbb{R}^n \to \mathbb{R}$$ and, for $x = (x^1, x^2, \dots, x^n)$,

$$f(x)=x^1\cdot x^2\cdots x^n; \qquad M=\{x: x^1+x^2+\dots+x^n=1;\ x^i>0\ \forall i\}.$$ Show that $f(x) \leq n^{-n}$, with equality if and only if $x^1=x^2=\dots=x^n=n^{-1}$. If I prove that, the exercise is easy, but the book also gives a hint.

Hint: First prove that $f$ has an absolute maximum on $M$. Then apply the multiplier rule to $\log f$, which has a maximum at the same point as $f$.

My problem is that I don't understand why the hint works: if I prove it, why does it imply that $f(x) \leq n^{-n}$? And why does $\log f$ have a maximum at the same point as $f$?

Thanks if you can help me with this.
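For what it's worth, the claimed bound can be sanity-checked numerically (a sketch only; the choices $n=5$ and $10{,}000$ samples are arbitrary) by evaluating $f$ at random points of the simplex $M$:

```python
import random

# Check f(x) = x1*...*xn <= n^(-n) on M = {x_i > 0, sum x_i = 1}
# by sampling random points of the simplex (illustration only).
n = 5
random.seed(0)
best = 0.0
for _ in range(10_000):
    w = [random.random() for _ in range(n)]
    s = sum(w)
    x = [wi / s for wi in w]      # now sum(x) == 1 and each x_i > 0
    p = 1.0
    for xi in x:
        p *= xi
    best = max(best, p)

bound = n ** (-n)                 # the claimed maximum, attained at x_i = 1/n
assert best <= bound
print(best, bound)
```

No sampled product exceeds $n^{-n}$, consistent with the statement to be proved.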

There are 2 solutions below.

Solution 1:

We need to prove that $$\frac{x_1+x_2+...+x_n}{n}\geq \sqrt[n]{x_1x_2...x_n}$$ for non-negative $x_i$.

Indeed, for $x_1x_2...x_n=0$ our inequality is obvious

and since our inequality is homogeneous, we can assume that $x_1x_2...x_n=1.$

Indeed, let $x_1x_2...x_n=k$, where $k>0$.

Now, let $x_i'=\dfrac{x_i}{\sqrt[n]{k}}$ for all $i$, and we obtain $x_1'x_2'...x_n'=1$,

but the inequality is not changed: $$\frac{x_1'+x_2'+...+x_n'}{n}\geq \sqrt[n]{x_1'x_2'...x_n'}.$$

Thus, we can relabel $x_i'$ as $x_i$ again, and we need to prove that $$x_1+x_2+...+x_n\geq n.$$ Let $$f(x_1,x_2,...,x_n,\lambda)=x_1+x_2+...+x_n-n+\lambda\left(\prod\limits_{i=1}^nx_i-1\right).$$ The constraint set $$M=\left\{(x_1,x_2,...,x_n)\ \middle|\ x_i>0,\ \prod\limits_{i=1}^nx_i=1\right\}$$ is not compact, but the sum $x_1+x_2+...+x_n$ is continuous on $M$ and tends to $+\infty$ whenever some $x_i\to\infty$ or some $x_i\to0^+$ (the constraint then forces another coordinate to blow up), so it attains its minimum at an interior point of $M$.

At that minimum the multiplier rule applies: for all $i$ $$\frac{\partial f}{\partial x_i}=0,$$ that is, $$1+\lambda\prod\limits_{k\neq i}x_k=0,$$ or, multiplying by $x_i$, $$x_i+\lambda\prod_{k=1}^nx_k=0.$$ Since $\prod\limits_{k=1}^nx_k=1$, this gives $x_i=-\lambda$ for every $i$, so $x_1=x_2=...=x_n$, and the constraint then forces a unique critical point: $$(1,1,...,1),$$ where the sum equals $n$, so the inequality holds everywhere on $M$.

Done!
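The computation above can be sketched numerically (an illustration, with arbitrary choices $n=4$ and $1000$ random perturbations): $(1,\dots,1)$ with $\lambda=-1$ satisfies the stationarity equations, and nearby points rescaled back onto the constraint surface have a coordinate sum of at least $n$:

```python
import random

# Check that x = (1,...,1), lambda = -1 is a critical point of
# f(x, lam) = sum(x) - n + lam*(prod(x) - 1), and that nearby points
# with prod(x) = 1 have coordinate sum >= n (illustration only).
n = 4
x = [1.0] * n
lam = -1.0

def prod(v):
    p = 1.0
    for vi in v:
        p *= vi
    return p

# Stationarity: df/dx_i = 1 + lam * prod_{k != i} x_k = 0 at this point.
for i in range(n):
    partial = 1.0 + lam * prod(x[:i] + x[i + 1:])
    assert abs(partial) < 1e-12

# Perturb, then rescale so the product is 1 again; the sum never dips below n.
random.seed(1)
for _ in range(1000):
    y = [1.0 + 0.5 * (random.random() - 0.5) for _ in range(n)]
    k = prod(y)
    y = [yi / k ** (1.0 / n) for yi in y]     # now prod(y) == 1
    assert sum(y) >= n - 1e-9
```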

Solution 2:

For a given $$ \sum_{k=1}^nx_k\tag1 $$ we wish to maximize $$ \sum_{k=1}^n\log(x_k)\tag2 $$ That is, for every variation such that $$ \sum_{k=1}^n\color{#C00}{1}\,\delta x_k=0\tag3 $$ we want to have $$ \sum_{k=1}^n\color{#C00}{\frac1{x_k}}\,\delta x_k=0\tag4 $$ So $\color{#C00}{\frac1{x_k}}$ is orthogonal to every variation that $\color{#C00}{1}$ is orthogonal to, which means the two vectors are parallel: there is a $\lambda$ so that $$ \color{#C00}{\frac1{x_k}}=\lambda\cdot\color{#C00}{1}\tag5 $$ This means that to maximize $(2)$ while holding $(1)$ constant, we need $x_k=\frac1\lambda$ (a constant).

In all boundary cases the product is $0$, so $(2)$ is $-\infty$; hence the maximum is attained at an interior critical point.
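The variational argument can be illustrated numerically (a sketch; the starting point, step size $0.01$, and iteration count are arbitrary choices): gradient ascent of $(2)$, with the gradient projected so that $(1)$ stays fixed, drives all coordinates to the common value $s/n = 1/\lambda$:

```python
# Projected gradient ascent of sum(log x_k) subject to sum(x_k) = s (fixed).
# At the maximum the gradient (1/x_1, ..., 1/x_n) must be parallel to
# (1, ..., 1), i.e. all coordinates equal (illustration only).
n, s = 4, 10.0
x = [1.0, 2.0, 3.0, 4.0]                 # arbitrary start with sum == s
for _ in range(20_000):
    g = [1.0 / xi for xi in x]           # gradient of sum(log x_k)
    m = sum(g) / n
    g = [gi - m for gi in g]             # project out the (1,...,1) direction
    x = [xi + 0.01 * gi for xi, gi in zip(x, g)]

# The iteration converges to x_k = s/n = 1/lambda, a constant.
for xi in x:
    assert abs(xi - s / n) < 1e-6
assert abs(sum(x) - s) < 1e-8            # the constraint was preserved
```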