Linearizing non-linear least squares: Problem with derivatives


We want to approximate $$y_i \approx a b^{x_i}$$ and thus have $$S=\sum_{i=1}^m (ab^{x_i}-y_i)^2$$ as the least squares error term. This term is not linear in $b$, so setting its derivatives to zero yields nonlinear equations that are not easy to solve.

So there is another method that instead approximately minimizes the relative error: taking logarithms gives $$\log y_i \approx \log a + x_i \log b,$$ resulting in $$\tilde{S}=\sum_{i=1}^m (\log a +x_i\log b-\log y_i)^2,$$ which is linear in $\log a$ and $\log b$.
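As a minimal sketch of this linearization (the data values and the use of `numpy.linalg.lstsq` are my own illustrative choices, not part of the question), one can solve the linear problem for $\log a$ and $\log b$ and then exponentiate:

```python
import numpy as np

# Illustrative data generated exactly from y = a * b**x with a = 2, b = 1.5
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * 1.5 ** x

# Linearized model: log y_i = log a + x_i * log b
# Design matrix with columns (1, x_i) for the unknowns (log a, log b)
A = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)

# Recover the original parameters from the log-space solution
a = np.exp(coef[0])
b = np.exp(coef[1])
print(a, b)  # recovers a ≈ 2, b ≈ 1.5 on this noise-free data
```

Since the data here are noise-free, the linear fit recovers the generating parameters; with noisy data the result minimizes $\tilde{S}$, not the original $S$.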

I am now unsure how to proceed: I would like to find $a$ and $b$ such that $\tilde{S}$ above is minimized.

Do I now have to set $$\frac{\partial \tilde{S}}{\partial \log a} =0$$ or $$\frac{\partial \tilde{S}}{\partial a} =0,$$ or are these the same? I would appreciate any help.