Minimizing sample variance of $n$ functions


Let $f_i$, $i=1,\dots,n$, be $n$ functions. I would like to minimize the sample variance of these functions subject to a linear constraint:

$$\text{minimize}\quad \frac{1}{n}\sum_{i=1}^n \bigl(f_i(x_i) - f_\text{mean}(x)\bigr)^2\\ \text{subject to}\quad \sum_{i=1}^n x_i = R$$

where $x = (x_1,\dots,x_n)$ and $f_\text{mean}(x) = \frac{1}{n}\sum_{i=1}^n f_i(x_i)$.
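As a sanity check on the setup, the problem can be solved numerically. A minimal sketch using `scipy.optimize.minimize`, assuming hypothetical quadratics $f_i(x) = a_i x^2$ (the coefficients `a` are my own illustrative choice, not from the problem):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical instance: f_i(x) = a_i * x^2 with distinct coefficients.
a = np.array([1.0, 2.0, 3.0])
n = len(a)
R = 1.0  # right-hand side of the constraint sum(x_i) = R

def sample_variance(x):
    # Objective: (1/n) * sum_i (f_i(x_i) - f_mean(x))^2
    f = a * x**2
    return np.mean((f - f.mean())**2)

res = minimize(
    sample_variance,
    x0=np.full(n, R / n),  # feasible starting point: equal split
    constraints={"type": "eq", "fun": lambda x: x.sum() - R},
)
print(res.x, res.fun)
```

With constraints supplied, SciPy defaults to SLSQP; for this instance the variance can be driven to zero by equalizing the $f_i(x_i)$, which the solver finds.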

How do I solve this problem? When I apply the Lagrange multiplier theorem, the partial derivatives of the objective come out to be zero, implying $\lambda = 0$. What is going on?
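For reference, here is the Lagrangian setup I am working from, with the partial derivative written out (the $f_\text{mean}$ term drops because $\sum_i (f_i(x_i) - f_\text{mean}(x)) = 0$):

```latex
\mathcal{L}(x,\lambda)
  = \frac{1}{n}\sum_{i=1}^n \bigl(f_i(x_i) - f_\text{mean}(x)\bigr)^2
    - \lambda\Bigl(\sum_{i=1}^n x_i - R\Bigr),
\qquad
\frac{\partial \mathcal{L}}{\partial x_j}
  = \frac{2}{n}\bigl(f_j(x_j) - f_\text{mean}(x)\bigr)\,f_j'(x_j) - \lambda = 0.
```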