Finding the minimum of $E = \frac{1}{N}\sum_{i=1}^m \delta_i^2 t_i^{-1}$ using the Cauchy-Schwarz inequality?


Let $\delta_i,t_i > 0$ for $i=1,\dots,m$ and $N\in \mathbb{N}$. I read that the minimum of the following (error) quantity can be found using the Cauchy-Schwarz inequality. $$ E = \frac{1}{N}\sum_{i=1}^m \delta_i^2 t_i^{-1}. $$ The minimum occurs at $$ t_i = \frac{\delta_i}{\sum_{j=1}^m \delta_j}. $$ I don't know how this was arrived at; does anyone know the method that was used?

On BEST ANSWER

Hint: You can apply the Cauchy-Schwarz inequality as follows:

\begin{align*} \left( \sum_{i=1}^m \delta_i \right) E &=\frac{1}{N} \left( \sum_{i=1}^m \left(\delta_i^{1/2}\right)^2 \right) \left(\sum_{i=1}^m \left(\delta_i t_i^{-1/2}\right)^2\right)\\ &\geq \frac{1}{N}\left(\sum_{i=1}^m \delta_i^{3/2} t_i^{-1/2} \right)^2 \end{align*} with equality if and only if $\delta_i^{1/2}=c\,\delta_i t_i^{-1/2}$ for some constant $c$ and all $i$.
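The Cauchy-Schwarz step above can be sanity-checked numerically. This is my own sketch, not part of the original answer: with $a_i = \delta_i^{1/2}$ and $b_i = \delta_i t_i^{-1/2}$, we should see $(\sum a_i^2)(\sum b_i^2) \geq (\sum a_i b_i)^2$, with equality exactly when $t_i$ is proportional to $\delta_i$.

```python
import numpy as np

rng = np.random.default_rng(1)
delta = rng.uniform(0.1, 2.0, size=4)

# Generic t: strict inequality expected (almost surely).
t = rng.uniform(0.1, 2.0, size=4)
lhs = delta.sum() * (delta**2 / t).sum()          # (sum a_i^2)(sum b_i^2), up to 1/N
rhs = (delta**1.5 / np.sqrt(t)).sum() ** 2        # (sum a_i b_i)^2, up to 1/N
assert lhs >= rhs - 1e-9

# Equality case: t proportional to delta, so a_i = c * b_i for all i.
t_eq = 3.0 * delta
lhs_eq = delta.sum() * (delta**2 / t_eq).sum()
rhs_eq = (delta**1.5 / np.sqrt(t_eq)).sum() ** 2
assert abs(lhs_eq - rhs_eq) < 1e-9
```

(The common factor $1/N$ is dropped since it does not affect the inequality.)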

To finish the proof, I assume that you optimize subject to the constraint $\sum_i t_i=1$. The equality condition $\delta_i^{1/2}=c\,\delta_i t_i^{-1/2}$ gives $t_i=c^2 \delta_i$; summing over $i$ yields $c^2 \sum_i \delta_i = 1$, so $c^2=\frac{1}{\sum_i \delta_i}$. Finally, $t_i=\frac{\delta_i}{\sum_j \delta_j}$.