Constrained parameters in least square curve fitting

I have some data points that need to be fit to the curve defined by

$$y(x)=\frac{k}{(x+a)^2} - b$$

I have considered that it can be done by the least squares method. However, the analytical solution gives me a negative $a$, so it puts the first point on the left branch of this hyperbola and I need all the points to fit to the right branch, thus $a$ must be positive. All my points have positive $x$ and $y$ is non-increasing.

Is there any way to add this type of constraint to analytical solution?

I would also kindly appreciate any links to related and/or useful information on iterative numerical solution. I need to program everything manually for my mobile app, so I can't use any external software or libraries.

3 Answers

Accepted answer (score 3):

Supposing that the OP is looking for a conventional method of regression, the present answer would not be suitable. This is why I post it as a distinct answer.

The calculation below is extremely simple, since there is no iteration and no need for initial guessed values.

[Figure: the calculation]

Numerical example:

[Figure: numerical example]

Result:

[Figure: fitted curve and computed parameters]

$\textbf{Comment :}$

At first sight, comparing the approximate values of the parameters $a,b,k$, it seems that the present result is not close to the previous result obtained with a classical non-linear regression.

In fact the curves drawn on the respective graphs are almost indistinguishable. Moreover the respective standard deviations are close: $0.403$ for the first method compared to $0.441$ for the second.

This is an unexpectedly good result, because with a small number of points the numerical integration introduces additional deviations. (Numerical integration is involved in the computation of the $S_i$.)

$\textbf{For information :}$

In this non-conventional method, instead of fitting the function, in which the parameters appear non-linearly, one fits an integral equation in which the same parameters appear linearly. The original function is a solution of the integral equation. This transforms the non-linear regression into a linear regression. For more explanation and examples: https://fr.scribd.com/doc/14674814/Regressions-et-equations-integrales

In the present case, differentiating the model gives $(x+a)y'=-2(y+b)$, and integrating yields a convenient integral equation: $$ay+2bx+xy+\int y\;dx=\text{constant}$$ One observes that the parameter $k$ seems to disappear, but it is in fact hidden in the constant. That is why an additional linear regression is necessary to compute the missing parameter.

Of course the fitting criterion is not the same as in the conventional methods. If a specific criterion is specified in the wording of the problem, one cannot avoid a non-linear regression adapted to that criterion. In that case one can start the iterative process from the parameter values provided by the above regression with the integral equation. This avoids the uncertain search for good "guessed" values of the parameters.
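The method described above can be sketched in plain Python (no external libraries, as the question requires). Differentiating the model gives $(x+a)y'=-2(y+b)$; integrating yields $xy+ay+2bx+\int y\,dx=\text{constant}$, which is linear in $a$, $b$, and the constant, so ordinary linear least squares applies. All function and variable names here are mine:

```python
def lstsq(rows, rhs):
    """Tiny dense least squares: solve (A^T A) u = A^T r by Gaussian
    elimination with partial pivoting."""
    m = len(rows[0])
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(m)] for i in range(m)]
    atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(m)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        atb[col], atb[piv] = atb[piv], atb[col]
        for r in range(col + 1, m):
            f = ata[r][col] / ata[col][col]
            for c in range(col, m):
                ata[r][c] -= f * ata[col][c]
            atb[r] -= f * atb[col]
    u = [0.0] * m
    for r in range(m - 1, -1, -1):
        u[r] = (atb[r] - sum(ata[r][c] * u[c] for c in range(r + 1, m))) / ata[r][r]
    return u

def fit_integral_equation(xs, ys):
    """Fit y = k/(x+a)^2 - b via the integral-equation linearization."""
    n = len(xs)
    # S[i]: cumulative trapezoidal approximation of the integral of y up to x[i]
    S = [0.0] * n
    for i in range(1, n):
        S[i] = S[i - 1] + 0.5 * (ys[i] + ys[i - 1]) * (xs[i] - xs[i - 1])
    # a*y_i + (2b)*x_i - C = -(x_i*y_i + S_i): linear in (a, 2b, C)
    rows = [(ys[i], xs[i], -1.0) for i in range(n)]
    rhs = [-(xs[i] * ys[i] + S[i]) for i in range(n)]
    a, two_b, _const = lstsq(rows, rhs)
    # k is hidden in the constant; recover it (and refine b) by a second
    # linear regression y = k*X - b with X = 1/(x+a)^2
    rows2 = [(1.0 / (x + a) ** 2, -1.0) for x in xs]
    k, b = lstsq(rows2, ys)
    return k, a, b
```

The trapezoidal $S_i$ is the only source of approximation; with exact data the relation holds exactly, so the recovered parameters are close to the true ones when the points are reasonably dense.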

Answer (score 3):

In any case, your model is nonlinear with respect to the parameters. So why not rewrite it as $$y(x)=\frac{k}{(x+\alpha^2)^2} - b$$ Then $a=\alpha^2$ is non-negative by construction, and $\alpha$ can be optimized without any constraint.
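The substitution can be expressed directly in the residual function handed to whatever unconstrained minimizer one uses; a minimal sketch (the function name is mine):

```python
def residuals(params, xs, ys):
    """Residuals of y = k/(x+a)^2 - b under the reparameterization
    a = alpha^2, so the fitted a is non-negative for any real alpha."""
    k, alpha, b = params
    a = alpha * alpha
    return [k / (x + a) ** 2 - b - y for x, y in zip(xs, ys)]
```

Minimizing the sum of squared residuals over $(k,\alpha,b)$ with no constraints then automatically respects $a \ge 0$; recover $a=\alpha^2$ afterwards.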

Answer (score 6):

I cannot see any difficulty with your data points x = {0, 1, 2, 3}, y = {-23, -32, -38, -40}.

With least-squares fitting, my result is shown in the figure below. The computed value of $a$ is positive, as expected.

If you obtain a negative $a$ or other aberrant results, this is probably due to the software that you use.

Since the regression is non-linear, most software proceeds by iterative calculation, which requires initial values for the sought parameters. Computing preliminary approximations of the parameters is the main weakness of such software: if the "guessed" starting values are not good enough, the iterative computation may lead to incorrect results.

Of course I cannot be sure that this is the true explanation of the trouble in your case without more information about your software's algorithm, especially how it approximates the starting values of the parameters.

[Figure: least-squares fit of the data, with positive $a$]

IN ADDITION, after the discussion in the comments:

So, you want to write your own program. I suggest a simplified approach to the non-linear regression for the function $$y=\frac{k}{(x+a)^2}-b$$ Start with a guessed value $a=a_0$. From the data $(x_k, y_k)$, compute transformed data $(X_k,y_k)$ with $$X_k=\frac{1}{(x_k+a_0)^2}$$ Then perform a linear regression for the unknown parameters $k, b$ with respect to the linear function $$y=kX-b$$ Compute a corrected value of $a_0$ and iterate the process.

Of course it is possible to proceed "by hand" with successive corrections of $a_0$ by trial and error, but that would be tedious.
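The scheme above can be sketched in plain Python (no libraries). The answer leaves the correction step for $a_0$ open; here I use a golden-section search on the sum of squared residuals as one simple choice of update, which also keeps $a$ inside a positive bracket by construction. All names and the bracketing interval are my own assumptions:

```python
def fit_kb_given_a(xs, ys, a):
    """Inner linear regression: y = k*X - b with X = 1/(x+a)^2."""
    X = [1.0 / (x + a) ** 2 for x in xs]
    n = len(xs)
    sx, sy = sum(X), sum(ys)
    sxx = sum(v * v for v in X)
    sxy = sum(v * y for v, y in zip(X, ys))
    k = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = -(sy - k * sx) / n          # intercept of the line is -b
    return k, b

def sse(xs, ys, a):
    """Sum of squared residuals for the best (k, b) at this a."""
    k, b = fit_kb_given_a(xs, ys, a)
    return sum((k / (x + a) ** 2 - b - y) ** 2 for x, y in zip(xs, ys))

def fit(xs, ys, a_lo, a_hi, tol=1e-10):
    """Golden-section search on a in [a_lo, a_hi], assumed to bracket
    the minimum; the inner problem stays linear throughout."""
    g = (5 ** 0.5 - 1) / 2
    while a_hi - a_lo > tol:
        m1 = a_hi - g * (a_hi - a_lo)
        m2 = a_lo + g * (a_hi - a_lo)
        if sse(xs, ys, m1) < sse(xs, ys, m2):
            a_hi = m2
        else:
            a_lo = m1
    a = 0.5 * (a_lo + a_hi)
    k, b = fit_kb_given_a(xs, ys, a)
    return k, a, b
```

Because only the scalar $a$ is searched while $k$ and $b$ come from a closed-form linear regression, this stays simple enough to implement by hand in any mobile language.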