Nonlinear regression with correlated errors


It's my first post here and I'm a newbie in statistics, so please forgive me if I'm doing something wrong or explaining myself badly.

Anyway, I have a problem similar to this one: How to perform nonlinear regression with correlated errors?, i.e. a least squares fitting problem.

My points are the means of 50 sets of data and are fitted well by "half" a parabola (something like $x^2 - 1/2$, with $x \in [-1,0]$), but the errors are strongly correlated or anti-correlated at all distances.

More precisely, close points are positively correlated, while far-away points are anti-correlated. I'm not sure about this, but it looks like for some reason the integral of the curve might be fixed: within a given set, if points lie above the mean in the first half, they lie below it in the second half, and vice versa.
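To see whether a fixed-integral constraint alone could produce this pattern, I tried a quick simulation. The residual model below (a random slope through the midpoint) is purely my own assumption, chosen only so that each set's deviation from the mean curve integrates to zero over $[-1,0]$:

```python
import numpy as np

# Toy check of the fixed-integral idea: give each simulated set a smooth
# deviation from the mean curve whose integral over [-1, 0] is zero, then
# look at the correlations between points. The linear-in-x deviation is an
# assumed, purely illustrative model.
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 0.0, 20)   # 20 points, so the exact midpoint is not hit

# Deviation = random slope pivoting about the midpoint -0.5 (integrates to
# zero exactly), plus a little independent noise per point.
slopes = rng.normal(size=(5000, 1))
residuals = slopes * (x + 0.5) + 0.05 * rng.normal(size=(5000, x.size))

C = np.corrcoef(residuals, rowvar=False)
print(C[0, 1])    # neighbouring points: strongly positive
print(C[0, -1])   # opposite ends: strongly negative
```

With this toy model, neighbouring points come out strongly positively correlated and the two ends strongly negatively correlated, which at least qualitatively matches what I see in my data.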

The problem is, I have no idea how to handle errors in this situation.

I hope I have been clear and would appreciate any suggestion on the subject. Thank you very much in advance.


This is actually a linear regression problem, because the model is linear in the coefficients $c_i$ even though it is quadratic in the variable $x$. The standard assumption in linear regression is that the residuals are uncorrelated (and normally distributed, if least squares is to be optimal); uncorrelatedness matters because minimizing the sum of squared residuals only makes sense when the residuals are uncorrelated. In your case, points close together in $x$ show high positive correlation, and points far apart show high negative correlation. In some instances the correlation structure can be modeled and a curve fit can be made based on that structure (generalized least squares). But here you seem to have a very extreme case that is hard to characterize and is peculiar. I do not think there is a good way to handle this.
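For completeness, the "model the correlation structure and fit with it" route is generalized least squares (GLS). A minimal sketch, assuming the error covariance $\Sigma$ were known; the cosine covariance here is only an illustrative stand-in with the right sign pattern (positive nearby, negative far apart), not the actual structure of your data:

```python
import numpy as np

# Sketch of generalized least squares (GLS) for the quadratic model
# y = c0 + c1*x + c2*x^2 with correlated errors, assuming the error
# covariance Sigma is known. The cosine covariance is an illustrative
# assumption, not the poster's actual correlation structure.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 0.0, 25)

d = np.abs(x[:, None] - x[None, :])
Sigma = 0.001 * np.cos(np.pi * d) + 0.0005 * np.eye(len(x))

# Simulate correlated noise around the "half parabola" x^2 - 1/2.
y_true = x**2 - 0.5
y = y_true + rng.multivariate_normal(np.zeros(len(x)), Sigma)

# Design matrix: the model is linear in the coefficients c_i.
X = np.column_stack([np.ones_like(x), x, x**2])

# GLS estimator: beta = (X' Sigma^-1 X)^-1 X' Sigma^-1 y
Si = np.linalg.inv(Sigma)
beta = np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)
print(beta)
```

When $\Sigma$ is the identity this reduces to ordinary least squares; the whole difficulty in your case is that $\Sigma$ would first have to be estimated from the 50 sets, and an extreme structure like yours is hard to estimate reliably.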