• Problem formulation
I have to fit the following nonlinear model to a dataset:
$$f(x)=\frac{C_1 \cdot a}{a^2 + C_2 \cdot x^2}$$
$a$: fitting parameter
$C_1, C_2$: given constants
I can't apply standard least-squares fitting directly, since the model is nonlinear in the parameter $a$, and I don't see a way to convert it to a linear problem.
• What I have tried so far:
My approach is to formulate a minimization problem:
$$a=\underset{a}{\operatorname{arg\,min}}\left(\sum_{i=0}^{N} \left|f_i-f(x_i)\right|^2\right)$$
where $f(x_i)$ is my fitting function evaluated at the point $x_i$ and $f_i$ is the corresponding function value from the dataset.
Then I'd apply an iterative method such as gradient descent or Newton's method to find $a$.
Alternatively, I'd apply RANSAC to fit the model, but both methods are iterative and probably too slow.
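For concreteness, here is a minimal sketch of this iterative approach in Python, assuming SciPy's general-purpose `least_squares` solver; the constant values, the synthetic dataset, and the starting guess are placeholders, not part of the actual problem:

```python
import numpy as np
from scipy.optimize import least_squares

C1, C2 = 2.0, 3.0  # given constants (placeholder values for illustration)

def model(a, x):
    # f(x) = C1*a / (a^2 + C2*x^2)
    return C1 * a / (a**2 + C2 * x**2)

def residuals(params, x, f):
    # residual vector f_i - f(x_i); least_squares minimizes its squared norm
    return f - model(params[0], x)

# synthetic dataset standing in for the real measurements
rng = np.random.default_rng(0)
x = np.linspace(0.1, 5.0, 50)
f = model(1.5, x) + 0.01 * rng.standard_normal(x.size)

# iterative nonlinear least squares from an initial guess a0 = 1.0
result = least_squares(residuals, x0=[1.0], args=(x, f))
print("fitted a:", result.x[0])
```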
Is there a way I could linearize this model or a different method I could apply?
• Answer
Transform the data by letting $x'=x^2$ and $g(x')=\frac{1}{f(x)}$. Then you have $$g(x')=\frac{a}{C_1}+\frac{C_2}{C_1 a}\,x'$$ Let $b=\frac{a}{C_1}$ (the intercept) and $m=\frac{C_2}{C_1 a}$ (the slope), and you have a standard linear regression with parameters $m$ and $b$. Once those are fitted, recover $a$ from either one, e.g. $a=C_1 b$ or $a=\frac{C_2}{C_1 m}$.
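A minimal sketch of this linearization, again in Python with placeholder constants and noise-free synthetic data (`a_true` is a hypothetical value used only to generate the example dataset):

```python
import numpy as np

C1, C2 = 2.0, 3.0  # given constants (placeholder values)
a_true = 1.5       # hypothetical ground truth, used only to generate data

# synthetic dataset standing in for the real measurements
x = np.linspace(0.1, 5.0, 50)
f = C1 * a_true / (a_true**2 + C2 * x**2)

# transform: x' = x^2, g = 1/f, so g = b + m*x' with b = a/C1, m = C2/(C1*a)
xp = x**2
g = 1.0 / f

# ordinary linear least squares for slope m and intercept b
m, b = np.polyfit(xp, g, 1)

# recover a from either fitted parameter
print("a from intercept:", C1 * b)
print("a from slope:    ", C2 / (C1 * m))
```

One caveat: the regression minimizes residuals of $1/f$ rather than of $f$, so noise on points where $f$ is small gets amplified weight; if that matters, the linearized fit still makes a good starting guess for an iterative refinement.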