Minimize $\sum a_i^2 \sigma^2$ subject to $\sum a_i = 1$


$$\min_{a_i} \sum_{i=1}^{n} {a_i}^2 \sigma^2\text{ such that }\sum_{i=1}^{n}a_i=1,$$ where $\sigma^2$ is a scalar.

The answer is $a_i=\frac{1}{n}$.
I tried Lagrangian method. How can I get that answer?

There are 3 answers below.

BEST ANSWER

Use Lagrange multiplier as you tried.

$$ \begin{align*} f(a_1, \cdots, a_n) &= \sigma^2 \sum a_i^2 \\ g(a_1, \cdots, a_n) &= 1 - \sum a_i \\ F(a_1, \cdots, a_n; \lambda) &= f - \lambda g \end{align*} $$

Partial derivatives are $$ \begin{align*} \frac{\partial F}{\partial a_j} &= 2 \sigma^2 a_j - \lambda \\ \frac{\partial F}{\partial \lambda} &= -1 + \sum a_i. \end{align*} $$ Since $\frac{\partial g}{\partial a_i} = -1 \neq 0$ for every $i$, the constraint qualification holds, so the Lagrange multiplier theorem guarantees that these derivatives vanish wherever $a_1, \cdots, a_n$ attain a minimum. From $\frac{\partial F}{\partial a_j} = 0$ we get $a_j = \frac{\lambda}{2 \sigma^2}$ (assuming $\sigma^2 \neq 0$; if $\sigma^2 = 0$ the objective is identically zero and every feasible point is optimal). And from $\frac{\partial F}{\partial \lambda} = 0$, we get $\lambda = \frac{2 \sigma^2}{n}$ and hence $a_j = \frac{1}{n}$.

Let's check that this indeed gives a minimum. If we "move" the point a little, that is, if we put $\alpha_i = \varepsilon_i + 1/n$ where $\sum \varepsilon_i = 0$, then it satisfies $g(\alpha_1, \cdots, \alpha_n) = 0$, and expanding the squares (the cross terms vanish because $\sum \varepsilon_i = 0$) gives $$ f(\alpha_1, \cdots, \alpha_n) = f\big(\frac{1}{n}, \cdots, \frac{1}{n}\big) + \sigma^2 \sum \varepsilon_i^2 \\ \geq f\big(\frac{1}{n}, \cdots, \frac{1}{n}\big) = \frac{\sigma^2}{n}, $$ so $a_j = 1/n$ indeed attains the minimum.
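The perturbation argument above can also be checked numerically. The following sketch (my own illustration; $n = 5$ and $\sigma^2 = 2$ are arbitrary test values) compares the objective at $a_i = 1/n$ against random feasible points $a_i = 1/n + \varepsilon_i$ with $\sum \varepsilon_i = 0$:

```python
import random

def objective(a, sigma2):
    """The objective f(a) = sigma^2 * sum(a_i^2)."""
    return sigma2 * sum(x * x for x in a)

n = 5
sigma2 = 2.0
a_star = [1.0 / n] * n
f_star = objective(a_star, sigma2)  # equals sigma2 / n

random.seed(0)
for _ in range(1000):
    eps = [random.uniform(-1, 1) for _ in range(n)]
    shift = sum(eps) / n
    eps = [e - shift for e in eps]      # enforce sum(eps) = 0
    a = [1.0 / n + e for e in eps]      # feasible: sum(a) = 1
    # f(a) = f(a_star) + sigma2 * sum(eps_i^2) >= f(a_star)
    assert objective(a, sigma2) >= f_star - 1e-12
```

Every random feasible perturbation yields an objective value at least $f(1/n, \cdots, 1/n) = \sigma^2/n$, as the expansion predicts.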


$\displaystyle \sum_{i=1}^{n}(x-a_i)^2\ge0,\forall x\in \mathbb{R}$

$\displaystyle \Rightarrow \sum_{i=1}^{n}(x^2+a_i^2-2xa_i)\ge0$

$\displaystyle \Rightarrow nx^2+\sum_{i=1}^{n}a_i^2-2x\sum_{i=1}^{n}a_i\ge0$

Now we have a quadratic in $x$ which is always greater than or equal to zero, so it has either a repeated real root or two complex roots. This implies that its discriminant is less than or equal to zero.

Discriminant $=\displaystyle D=4\left(\sum_{i=1}^{n}a_i\right)^2-4n\sum_{i=1}^{n}a_i^2\le0$

$\displaystyle \Rightarrow\left(\sum_{i=1}^{n}a_i\right)^2-n\sum_{i=1}^{n}a_i^2\le0$

$\displaystyle \Rightarrow 1-n\sum_{i=1}^{n}a_i^2\le 0$

$\displaystyle \Rightarrow \frac{1}{n}\le \sum_{i=1}^{n}a_i^2$

Equality holds iff the quadratic has a repeated real root.

But then $\displaystyle \sum_{i=1}^{n}(x-a_i)^2=0$ for some $x\in \mathbb{R}$

$\Rightarrow x=a_i,\forall 1\le i\le n$

$\Rightarrow \sum _{i=1}^{n}x=nx=\sum _{i=1}^{n}a_i=1$

$\Rightarrow x=a_i=\frac{1}{n},\forall 1\le i\le n$

Now, as $\sigma^2\ge 0$ is a constant factor, the minimum of $\sum_{i=1}^{n}a_i^2\sigma^2$ is attained when $\sum_{i=1}^{n}a_i^2$ is minimized (when $\sigma^2 = 0$ the objective is identically zero, so every feasible point is a minimizer).
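The inequality $\sum a_i^2 \ge 1/n$ derived from the discriminant can be spot-checked numerically. A small sketch (my own illustration; $n = 7$ is an arbitrary test value) draws random positive weights, normalizes them so they sum to 1, and verifies the bound and its equality case:

```python
import random

def sum_sq(a):
    """Compute sum of squares of the entries of a."""
    return sum(x * x for x in a)

n = 7
random.seed(1)
for _ in range(1000):
    raw = [random.uniform(0.1, 2.0) for _ in range(n)]
    s = sum(raw)
    a = [x / s for x in raw]            # normalize: sum(a) = 1
    # Discriminant argument: sum(a_i^2) >= 1/n for any feasible a
    assert sum_sq(a) >= 1.0 / n - 1e-12

# Equality case: a_i = 1/n attains the bound exactly
assert abs(sum_sq([1.0 / n] * n) - 1.0 / n) < 1e-12
```

No random feasible point beats the bound, and $a_i = 1/n$ attains it, matching the repeated-root analysis above.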

I think this is a much better and more elementary solution than one using Lagrange multipliers.


I think you could approach this problem with the Cauchy-Schwarz inequality.

Using the Cauchy-Schwarz inequality, and the condition for it to be an equality you'll conclude that for the minimum value of $\sum{{a_i}^2 {\sigma}^2}$ is attained iff $a_i\sigma=\lambda$ for some $\lambda$ which is a constant. Since $\sigma$ is not equal to 0, this implies $a_i={\lambda}/{\sigma}^2$. This along with the constraint $\sum{a_i}=1$ yields $\lambda={{\sigma}^2}/n$ so that $a_i=1/n$