Find the maximum of $(x_1\cdots x_n)^2$ subject to $x_1^2+\ldots +x_n^2=1$


Find the maximum of $(x_1\cdots x_n)^2$ subject to $x_1^2+\ldots +x_n^2=1$ and show that $$\left (\prod_{k=1}^na_k\right )^{\frac{1}{n}}\leq \frac{1}{n}\sum_{k=1}^na_k$$

For the first part I applied the method of Lagrange multipliers.

We have the function $f(x_1, \ldots , x_n)=(x_1\cdot \ldots \cdot x_n)^2$ and the constraint $g(x_1, \ldots , x_n)=x_1^2+ \ldots + x_n^2-1=0$.

The Lagrange function is \begin{equation*}L(x_1, \ldots , x_n ,\lambda )=f(x_1, \ldots , x_n)+\lambda g(x_1, \ldots , x_n)=(x_1\cdot \ldots \cdot x_n)^2+\lambda \left (x_1^2+ \ldots + x_n^2-1\right )=\left (\prod_{j=1}^nx_j\right )^2+\lambda \left (\sum_{j=1}^nx_j^2-1\right )\end{equation*} We calculate the partial derivatives of $L$ : \begin{align*}&\frac{\partial}{\partial{x_i}}L(x_1, \ldots , x_n ,\lambda )=2\left (\prod_{j=1}^nx_j\right )\cdot \left (\prod_{j=1, j\neq i}^nx_j\right ) +2\lambda x_i \\ & \frac{\partial}{\partial{\lambda }}L(x_1, \ldots , x_n ,\lambda )=\sum_{j=1}^nx_j^2-1 \end{align*} with $1\leq i\leq n$.

To get the extrema we set the partial derivatives equal to zero.

Then we get the following system: \begin{align*}&2\left (\prod_{j=1}^nx_j\right )\cdot \left (\prod_{j=1, j\neq i}^nx_j\right ) +2\lambda x_i =0 \Rightarrow x_i\cdot \prod_{j=1, j\neq i}^nx_j^2 +\lambda x_i =0 \Rightarrow x_i\cdot \left (\prod_{j=1, j\neq i}^nx_j^2 +\lambda \right ) =0 \\ & \sum_{j=1}^nx_j^2-1=0 \end{align*}

How can we continue?


There are 4 answers below.

Best answer:

Obviously, $x_i \ne 0$ at a maximum (otherwise the product is $0$). Multiply each of your first $n$ equations by $x_i$ and add them up: $$ n\prod_{j=1}^nx_j^2+\lambda\sum_{i=1}^nx_i^2 = 0, $$ and since $\sum_{i=1}^nx_i^2=1$, $$ n\prod_{j=1}^nx_j^2+\lambda= 0. $$ It follows that $$ \lambda = -n\prod_{j=1}^nx_j^2. $$ Substituting this back into $\prod_{j=1, j\neq i}^nx_j^2 +\lambda =0$ gives $\prod_{j\neq i}x_j^2 = n x_i^2\prod_{j\neq i}x_j^2$, hence $n x_i^2 = 1$ and $$ x_i = \pm \frac{1}{\sqrt{n}}. $$
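As a quick numerical sanity check of this answer (a Python sketch, not part of the original argument; the dimension $n=5$ is chosen arbitrarily), the point $x_i = 1/\sqrt{n}$ satisfies the constraint, attains the value $1/n^n$, and is not beaten by random points on the unit sphere:

```python
import math
import random

def f(x):
    """Objective: (x_1 * ... * x_n)^2."""
    p = 1.0
    for xi in x:
        p *= xi
    return p * p

n = 5
# Candidate maximizer from the Lagrange system: every x_i = 1/sqrt(n).
x_star = [1.0 / math.sqrt(n)] * n
assert abs(sum(xi * xi for xi in x_star) - 1.0) < 1e-12  # constraint holds
f_star = f(x_star)
assert abs(f_star - 1.0 / n**n) < 1e-12  # maximum value is 1/n^n

# Random points on the unit sphere never exceed f_star.
random.seed(0)
for _ in range(1000):
    v = [random.gauss(0, 1) for _ in range(n)]
    norm = math.sqrt(sum(vi * vi for vi in v))
    x = [vi / norm for vi in v]
    assert f(x) <= f_star + 1e-12
```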

Another answer:

Lagrange multipliers just ask (when the gradients of both the constraint and the objective function are nonzero) where the two gradients are parallel. In the case that all partials are nonzero, this simply asks for points where the ratios of corresponding partials all come out the same.

If any coordinate is zero, the product is zero, which is not a maximum. Otherwise, since the objective depends only on the $x_i^2$, we may assume all $x_i > 0$ and continue. The $x_i$ partial of $P^2$ is $$2 \frac{P^2}{x_i},$$ where $P$ is the product of the $x_i$.

The $x_i$ partial of the sum of squares is $$ 2 x_i$$

Each ratio is therefore $$ \frac{P^2}{x_i^2}. $$ Lagrange multipliers demand that all these ratios be equal, meaning all the $x_i^2$ must be equal. So each $$ x_i = \frac{1}{\sqrt n}. $$

Another answer:

Just to provide a non-calculus alternative.

We can show that the maximum of $(x_1x_2\cdots x_n)^2$ subject to $x_1^2 + \cdots +x_n^2 =1$ must occur when $x_1^2 = \cdots = x_n^2$ by using AM-GM for two non-negatives (which is relatively easy to show):

AM-GM for two non-negatives: If $A,B\ge 0$, then $(A+B)/2 \ge \sqrt{AB}$, and equality holds if and only if $A=B$. (Proof of this later, or try it yourself.)

Now we prove that this maximum occurs when $x_1^2 =\cdots = x_n^2$. Suppose to the contrary that, say, $x_1^2\neq x_2^2$ at a maximizing point, and denote the maximum value by $$M = (x_1^2 x_2^2)(x_3\cdots x_n)^2.$$ Since $x_1^2 + x_2 ^2 +\cdots+ x_n^2 = 1$, we have $x_1^2 + x_2 ^2 = 1-Q$, where $Q = x_3^2+\cdots+x_n^2$. I claim we can pick better values for $x_1$ and $x_2$ that improve the maximum. Indeed, set $y_1 = y_2 = \sqrt{\frac{1-Q}{2}}$. Then $y_1 ^2 + y_2 ^2 = 1-Q$, so $$y_1^2 +y_2 ^2 + x_3 ^2 +\cdots + x_n^2 = 1.$$ And by AM-GM for two non-negatives, $$x_1^2x_2^2 < \left(\frac{x_1^2+x_2^2}{2}\right)^2 = \left(\frac{1-Q}{2}\right)^2 = y_1^2y_2^2,$$ where the inequality is strict because $x_1^2\neq x_2^2$.

Hence $y_1^2y_2^2(x_3\cdots x_n)^2> M$, contradicting the maximality of $M$.

Thus we conclude $x_i^2$ are all the same values when achieving maximum and hence $x_i^2 = 1/n$. $\blacksquare$
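The averaging step in this proof can be checked numerically. The following Python sketch (the starting point $(0.6, 0.2, \ldots)$ is an arbitrary example, not from the answer) replaces two unequal squares by their average and verifies that the constraint is preserved while the product strictly increases:

```python
import math

# A feasible point on the unit sphere with x_1^2 != x_2^2 (arbitrary example).
x = [0.6, 0.2, math.sqrt(1 - 0.6**2 - 0.2**2)]
assert abs(sum(t * t for t in x) - 1.0) < 1e-12

def sq_product(v):
    """The objective (x_1 * ... * x_n)^2."""
    p = 1.0
    for t in v:
        p *= t
    return p * p

before = sq_product(x)

# Exchange step: replace x_1, x_2 by y_1 = y_2 = sqrt((x_1^2 + x_2^2)/2).
y = math.sqrt((x[0]**2 + x[1]**2) / 2)
x_new = [y, y] + x[2:]

assert abs(sum(t * t for t in x_new) - 1.0) < 1e-12  # still on the sphere
assert sq_product(x_new) > before  # the product strictly increased
```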


Proof of AM-GM for two non-negatives.

The inequality is straightforward as $0\le (\sqrt A - \sqrt B)^2 = A + B -2\sqrt{AB} $, so $ \sqrt{AB}\le (A+B)/2$. For the equality case, note $ \sqrt{AB} = (A+B)/2$ if and only if $(\sqrt A - \sqrt B)^2 = 0 $, if and only if $A=B$. $\blacksquare$
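A quick Python check of the two-variable inequality and its equality case (random inputs chosen arbitrarily, not from the answer):

```python
import math
import random

# sqrt(AB) <= (A + B)/2 for random non-negative pairs.
random.seed(2)
for _ in range(1000):
    A, B = random.uniform(0, 10), random.uniform(0, 10)
    assert math.sqrt(A * B) <= (A + B) / 2 + 1e-12

# Equality holds when A == B.
A = B = 3.7
assert abs(math.sqrt(A * B) - (A + B) / 2) < 1e-12
```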


Remark. Of course we have to know that $(x_1x_2\cdots x_n)^2$ attains a maximum somewhere on the unit sphere to begin with. This is true because $f(x_1,\ldots,x_n)=(x_1\cdots x_n)^2$ is continuous on the compact unit sphere (extreme value theorem). This is trickier to justify without analysis; a non-analysis proof eludes me at the moment.

Remark 2. It isn't too surprising we involve AM-GM here, as you may observe that the corollary of this problem is to demonstrate the general AM-GM inequality for $n$ non-negatives.

Another answer:

A bit late, but I thought it worth mentioning.

Set $y_k = x_k^2$ for $k=1,\ldots,n$. The problem becomes: maximize

$$\prod_{k=1}^n y_k \text{ subject to } \sum_{k=1}^ny_k =1\text{ with } y_1,\ldots ,y_n \geq 0$$

Since the product is $0$ if any of the $y_k$ is zero, we can assume $y_1,\ldots ,y_n > 0$.

Now, Jensen (or concavity of $\ln$) gives immediately

$$\sum_{k=1}^n \ln y_k \leq n \ln \sum_{k=1}^n \frac{y_k}n = \ln \frac 1{n^n}$$

Hence,

$$\prod_{k=1}^n x_k^2 = \prod_{k=1}^n y_k \leq \frac 1{n^n}$$

and equality is reached for $y_k = \frac 1n \Leftrightarrow x_k =\pm \frac 1{\sqrt n}$.
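For the second part of the question, the AM-GM inequality follows from the maximum computed above; a sketch of the deduction in LaTeX (the symbol $S$ for the sum of the $a_k$ is introduced here, not taken from the answers):

```latex
% Deducing AM-GM from the bound (x_1 \cdots x_n)^2 <= 1/n^n on the unit sphere.
% Given a_1, \dots, a_n > 0, set S = \sum_{k=1}^n a_k and x_k = \sqrt{a_k/S},
% so that \sum_{k=1}^n x_k^2 = 1. The maximum just computed gives
\[
  \prod_{k=1}^n x_k^2 \;=\; \frac{\prod_{k=1}^n a_k}{S^n} \;\le\; \frac{1}{n^n},
\]
% hence \prod_{k=1}^n a_k \le (S/n)^n, and taking n-th roots yields
\[
  \left(\prod_{k=1}^n a_k\right)^{1/n} \;\le\; \frac{1}{n}\sum_{k=1}^n a_k .
\]
```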