Best fit of a curve of the form $Ax^B+C$


I am trying to fit data of the form $(x_i,y_i)$, $i=1,\ldots,n$, to a curve of the form $y=Ax^B+C$, where $B\in (0,1)$. All three constants $A,B,C$ are to be determined optimally (no particular norm for the moment). Also, the $x_i$'s are positive integers, if that helps.

Is there any standard method to attack such a problem?


There are 3 answers below.


The standard method is least squares, which in your case is non-linear, so you have to resort to an algorithm such as Levenberg-Marquardt, with the extra difficulty that $B$ is constrained.
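As a concrete sketch (on made-up sample data, assuming SciPy is available): `scipy.optimize.curve_fit` handles the bound on $B$ directly; note that with bounds it switches from plain Levenberg-Marquardt to a trust-region variant.

```python
import numpy as np
from scipy.optimize import curve_fit

# Made-up sample data following y = A*x**B + C with a little noise.
rng = np.random.default_rng(0)
x = np.arange(1, 51, dtype=float)          # positive integers, as in the question
y = 2.0 * x**0.6 + 5.0 + rng.normal(0.0, 0.1, x.size)

def model(x, A, B, C):
    return A * x**B + C

# Bounded non-linear least squares; B is constrained to [0, 1].
p0 = (1.0, 0.5, 0.0)                       # initial guess
popt, pcov = curve_fit(model, x, y, p0=p0,
                       bounds=([-np.inf, 0.0, -np.inf],
                               [ np.inf, 1.0,  np.inf]))
A, B, C = popt
```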

You can work out your particular case differently: view $B$ as an independent parameter. For a given value of $B$, you easily solve the linear fitting problem

$$y_i=Ax'_i+C,$$ where $x'_i=x_i^B$, and compute the residual error $E(B)$.

Now you can use your preferred 1-D minimizer (e.g. golden-section search) to find $\min_{B\in[0,1]}E(B)$.
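A minimal sketch of this profile-over-$B$ idea, on made-up data and assuming SciPy (`minimize_scalar` with `method='bounded'` stands in for a hand-rolled golden-section search):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Made-up sample data following y = A*x**B + C with a little noise.
rng = np.random.default_rng(1)
x = np.arange(1, 51, dtype=float)
y = 2.0 * x**0.6 + 5.0 + rng.normal(0.0, 0.1, x.size)

def residual(B):
    # For fixed B the model y = A*x**B + C is linear in (A, C):
    # solve that least-squares problem and return E(B).
    X = np.column_stack([x**B, np.ones_like(x)])
    coef, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    # lstsq omits the residual when the design matrix is rank-deficient
    # (e.g. B ~ 0 makes the two columns collinear); recompute it then.
    return res[0] if res.size else np.sum((X @ coef - y)**2)

# 1-D bounded minimisation of E(B) on (0, 1].
out = minimize_scalar(residual, bounds=(1e-6, 1.0), method='bounded')
B = out.x
A, C = np.linalg.lstsq(np.column_stack([x**B, np.ones_like(x)]), y,
                       rcond=None)[0]
```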


Make sure to have a look at the curve of $E(B)$. Hopefully it will be unimodal (a single minimum). It may turn out that the minimum lies outside the range, in which case the optimum is at one of the endpoints $B=0$ or $B=1$.


If your data set has outliers, you may need a robust method such as RANSAC, which requires solving the exact fit through three points.

You can eliminate $A$ and $C$ by

$$\frac{y_2-y_0}{y_1-y_0}=\frac{x_2^B-x_0^B}{x_1^B-x_0^B}=u,$$ which is nonlinear in $B$ and can be solved by regula falsi or possibly Newton's method. From $B$, you derive $A$ and $C$ easily.
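A sketch of this three-point solver, on made-up exact points (SciPy's `brentq` bracketing root finder stands in for regula falsi):

```python
import numpy as np
from scipy.optimize import brentq

# Made-up three exact points on y = 2*x**0.6 + 5.
x0, x1, x2 = 1.0, 4.0, 9.0
A_true, B_true, C_true = 2.0, 0.6, 5.0
y0, y1, y2 = (A_true * np.array([x0, x1, x2])**B_true + C_true)

u = (y2 - y0) / (y1 - y0)

# Root of f(B) = (x2^B - x0^B)/(x1^B - x0^B) - u on (0, 1).
f = lambda B: (x2**B - x0**B) / (x1**B - x0**B) - u
B = brentq(f, 1e-9, 1.0)

# With B known, A and C follow from two of the points.
A = (y1 - y0) / (x1**B - x0**B)
C = y0 - A * x0**B
```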


Final note:

If the dataset contains values such that $$\frac{x_2}{x_0}=\left(\frac{x_1}{x_0}\right)^2,$$ that is to say $$x_1=\sqrt{x_0x_2},$$ the previous equation simplifies to

$$\left(\frac{x_1}{x_0}\right)^B+1=u,$$ which is directly solvable for $B$.
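In that geometric-spacing case, $B=\log(u-1)/\log(x_1/x_0)$; a one-liner check on made-up points:

```python
import math

# Made-up points with x in geometric progression: x1 = sqrt(x0*x2).
x0, x1, x2 = 1.0, 3.0, 9.0
A, B_true, C = 2.0, 0.6, 5.0
y0, y1, y2 = (A * x**B_true + C for x in (x0, x1, x2))

u = (y2 - y0) / (y1 - y0)
# (x1/x0)^B + 1 = u  =>  B = log(u - 1) / log(x1/x0)
B = math.log(u - 1.0) / math.log(x1 / x0)
```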


The function to fit isn't linear in its parameters. Thus, the fitting requires a non-linear regression process.

The usual processes start with an initial guess of the parameters to be adjusted; the fitting is then carried out through an iterative process.

A non-conventional method, which doesn't require an initial guess and which is not iterative, is shown below:

[Image: the formulas of the non-iterative fitting method, based on an integral equation]

Note: for the power exponent $c$, the fitting criterion is also least squares, but applied to the antiderivative of $y(x)$ instead of $y(x)$ as usual. The final regression for the parameters $a$ and $b$ is standard least squares.

This method comes from: https://fr.scribd.com/doc/14674814/Regressions-et-equations-integrales

Some conditions on the data for a good accuracy of the fit are discussed in the referenced paper.


As a "rough" approach (if I understood what you mean by no particular norm), which has the practical advantage that it can be computed even in a spreadsheet, I would proceed as follows. Given that $$ \begin{array}{l} y = Ax^{\,B} + C \\ y - C = Ax^{\,B} \\ \log \left( {y - C} \right) = \log A + B\log x \\ \end{array} $$ I would:

  • put some tentative value for $C$,

  • perform the linear regression on the log formula, constraining the $B$ value if needed,

  • look for the value of $C$ which minimizes the regression error.

Clearly this implies that the least-squares error is taken on $\log(y_i - C)$, that is, on

$$ \left| {\,\frac{{y_i - C}}{{y(x_i ) - C}} - 1\,} \right| = \left| {\,\frac{{y_i - y(x_i )}}{{y(x_i ) - C}}\,} \right| $$

On a spreadsheet, the above translates into the following Solver setup:
Target Cell = regression error, to be minimized
By Changing Cell = C
Subject to the constraints 0 < B < 1.
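The same scan over $C$ can be sketched in a few lines of Python (made-up data, assuming SciPy; note the logarithm requires $y_i > C$, i.e. the $A>0$ case, and the constraint on $B$ is omitted for brevity):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import linregress

# Made-up sample data following y = A*x**B + C with a little noise.
rng = np.random.default_rng(2)
x = np.arange(1, 51, dtype=float)
y = 2.0 * x**0.6 + 5.0 + rng.normal(0.0, 0.05, x.size)

def log_fit_error(C):
    # Regression error of log(y - C) vs log(x); defined only for C < min(y).
    if C >= y.min():
        return np.inf
    r = linregress(np.log(x), np.log(y - C))
    resid = np.log(y - C) - (r.intercept + r.slope * np.log(x))
    return np.sum(resid**2)

# Look for the C that minimizes the regression error.
out = minimize_scalar(log_fit_error, bounds=(0.0, y.min() - 1e-6),
                      method='bounded')
C = out.x
r = linregress(np.log(x), np.log(y - C))
B, A = r.slope, np.exp(r.intercept)
```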