Least Squares Sensitivity to data


Let $(x_1,y_1),\ldots,(x_n,y_n)$ be my data set. I have a function $f(x,{\bf c})$ where ${\bf c}=(c_1,...,c_m)$ is a vector of $m$ parameters. I want to fit it to the data using non-linear least squares:

$$\min_{\bf c}{\sum_{i=1}^n (y_i - f(x_i,{\bf c}))^2}$$

I want to find the sensitivity of my optimal parameters ${\bf c}^*$ with respect to my data. In other words, how can I find

$$\frac{\partial c_j^*}{\partial y_i}$$

for $i=1,...,n$ and $j=1,...,m$.

In practice, I want to find the effect of perturbing my $y_i$'s by a small amount on the parameters ${\bf c}^*$ obtained.
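A minimal sketch of such a fit, assuming SciPy is available and using a hypothetical two-parameter model $f(x,{\bf c}) = c_1 e^{c_2 x}$ purely for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical model for illustration: f(x, c) = c0 * exp(c1 * x)
def f(x, c):
    return c[0] * np.exp(c[1] * x)

# Synthetic data (x_1, y_1), ..., (x_n, y_n) with small noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 25)
y = f(x, [2.0, -1.5]) + 1e-3 * rng.standard_normal(x.size)

# Nonlinear least squares: minimize sum_i (y_i - f(x_i, c))^2
result = least_squares(lambda c: y - f(x, c), x0=[1.0, -1.0])
c_star = result.x
print(c_star)  # close to the true parameters [2.0, -1.5]
```

Perturbing one $y_i$ and refitting gives a brute-force estimate of the sensitivities asked about above.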

Please give me some references and the name of this derivative. I am having a hard time finding information on this subject using Google. Also, please assume that my function $f$ is well behaved in terms of differentiability (it is continuous and differentiable multiple times).

Thank you very much for your help!

Best answer:

In terms of the mathematical problem, you could use the implicit function theorem, provided your minimization problem is well behaved enough that the first-order conditions characterize the optimum, i.e., the solution of $$g_j({\bf c},{\bf y}):=\frac{\partial}{\partial c_j}\left(\sum_{k=1}^n (y_k - f(x_k,{\bf c}))^2\right)=0,\qquad j=1,\dots,m,$$ is exactly the minimizer of $$\min_{\bf c}{\sum_{k=1}^n (y_k - f(x_k,{\bf c}))^2}.$$
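For concreteness, the partial derivatives that enter the theorem can be written out; this is just the chain rule applied to the sum of squares:

$$g_j({\bf c},{\bf y}) = -2\sum_{k=1}^n \bigl(y_k - f(x_k,{\bf c})\bigr)\,\frac{\partial f(x_k,{\bf c})}{\partial c_j},$$

so

$$\frac{\partial g_j}{\partial y_i} = -2\,\frac{\partial f(x_i,{\bf c})}{\partial c_j},\qquad \frac{\partial g_j}{\partial c_k} = 2\sum_{l=1}^n\left[\frac{\partial f(x_l,{\bf c})}{\partial c_j}\frac{\partial f(x_l,{\bf c})}{\partial c_k} - \bigl(y_l - f(x_l,{\bf c})\bigr)\frac{\partial^2 f(x_l,{\bf c})}{\partial c_j\,\partial c_k}\right].$$

Note that $\partial g_j/\partial c_k$ is just the Hessian of the objective; when the residuals at the optimum are small, its second term is often negligible (the Gauss–Newton approximation).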

With the implicit function theorem, you can check how ${\bf c}$ has to change when $y_i$ changes so that $g_j({\bf c},{\bf y})$ remains zero, i.e., so that you stay at the optimal solution. For a single parameter ($m=1$) it states $$\frac{d c_j^*}{d y_i}=-\frac{\partial g_j/\partial y_i}{\partial g_j/\partial c^*_j},$$ which is what you want. For $m>1$ the same statement holds in matrix form, $$\frac{\partial {\bf c}^*}{\partial y_i}=-\left(\frac{\partial g}{\partial {\bf c}}\right)^{-1}\frac{\partial g}{\partial y_i},$$ where $\partial g/\partial {\bf c}$ is the $m\times m$ Hessian of the objective evaluated at ${\bf c}^*$.
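This can be checked numerically. The sketch below (assuming NumPy/SciPy and, again, a hypothetical model $f(x,{\bf c}) = c_1 e^{c_2 x}$ chosen only for illustration) uses the small-residual Gauss–Newton approximation of the Hessian, under which the sensitivity matrix reduces to $(J^\top J)^{-1}J^\top$ with $J_{ij}=\partial f(x_i,{\bf c}^*)/\partial c_j$, and compares one column against a finite-difference refit:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical model for illustration: f(x, c) = c0 * exp(c1 * x)
def f(x, c):
    return c[0] * np.exp(c[1] * x)

def jac_f(x, c):
    # Jacobian of f w.r.t. c, one row per data point: J[i, j] = df(x_i)/dc_j
    e = np.exp(c[1] * x)
    return np.column_stack([e, c[0] * x * e])

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 25)
y = f(x, [2.0, -1.5]) + 1e-3 * rng.standard_normal(x.size)

def fit(y, c0=(1.0, -1.0)):
    # Tight tolerances so finite differences of the fit are meaningful
    return least_squares(lambda c: y - f(x, c), x0=c0,
                         xtol=1e-14, ftol=1e-14, gtol=1e-14).x

c_star = fit(y)

# Implicit-function-theorem sensitivity with the Gauss-Newton
# approximation H ~ 2 J^T J and dg/dy_i = -2 J[i, :]:
# the m-by-n matrix dc*/dy is (J^T J)^{-1} J^T evaluated at c*.
J = jac_f(x, c_star)
dc_dy = np.linalg.solve(J.T @ J, J.T)

# Check column i = 5 against a brute-force finite-difference refit.
i, eps = 5, 1e-4
y_pert = y.copy()
y_pert[i] += eps
fd = (fit(y_pert, c0=c_star) - c_star) / eps
print(np.allclose(fd, dc_dy[:, i], rtol=1e-2, atol=1e-2))
```

Because the residuals here are small, the Gauss–Newton sensitivity agrees closely with the refit; with large residuals one would need the full Hessian from the exact formula.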

It might not work if your $f(\cdot)$ is not nicely behaved, so check the assumptions of the implicit function theorem. Also, this kind of question usually goes under the name sensitivity analysis (or perturbation analysis) of an optimization problem, which should help your search; the problem sounds important, so I am sure there is a literature on it.