Exponential extrapolation


Given a set of points in the plane, $(x_1,y_1),(x_2,y_2),\ldots,(x_n,y_n)$, and a function $f(x)=k+ab^x$, the task is to find the values of $k$, $a$, and $b$ that minimize the following sum:

$$\sum_{i=1}^n (k+ab^{x_i}-y_i)^2.$$

I tried taking derivatives of this sum (with respect to $k$, $a$, and $b$) and got a system of three equations, but I cannot express the unknowns from it. A numerical answer would also be fine.

Does anyone know how to do that? Thanks for the help.


BEST ANSWER

I used @Taro's hint to solve this task.

I made the following change:

$$A=\ln a,\qquad B=\ln b$$

and rewrote the model (using $y_i-k=ab^{x_i}$) as $$\ln(y_i-k)=A+Bx_i.$$

This leaves a linear system in the two unknowns $A$ and $B$ ($k$ acts as a parameter):

$$\begin{cases} \sum_{i=1}^n \ln(y_i-k)=nA+B\sum_{i=1}^n x_i \\ \sum_{i=1}^n x_i\ln(y_i-k)=A\sum_{i=1}^n x_i+B\sum_{i=1}^n x_i^2 \end{cases}$$

As you can see, the coefficients on the right-hand side of the system are static; only the left-hand side depends on $k$.

I then solved for the best fit with a binary search, using $k$ as the search parameter.

Here's a result (the original post included a plot of the fitted curve here).
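For concreteness, here's a short Python sketch of the procedure above. The synthetic data (generated from $y=2+3\cdot 1.5^x$, so the true parameters are known), the search interval, and the use of a simple grid scan in place of the binary search are all my own illustrative assumptions; note the linearization requires $y_i > k$ for every point.

```python
import numpy as np

def fit_linearized(x, y, k):
    """For a fixed k, solve the 2x2 normal equations for A = ln a, B = ln b."""
    z = np.log(y - k)                      # requires y_i > k
    n = len(x)
    M = np.array([[n, x.sum()],
                  [x.sum(), (x**2).sum()]])
    rhs = np.array([z.sum(), (z * x).sum()])
    A, B = np.linalg.solve(M, rhs)
    return A, B

def residual(x, y, k):
    """Original least-squares objective, with a, b taken from the linearized fit."""
    A, B = fit_linearized(x, y, k)
    a, b = np.exp(A), np.exp(B)
    return ((k + a * b**x - y)**2).sum()

# hypothetical noiseless data from y = 2 + 3 * 1.5**x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2 + 3 * 1.5**x

# scan k on a grid below min(y); a binary search over the same interval,
# as in the answer above, works the same way
ks = np.linspace(y.min() - 5.0, y.min() - 1e-6, 2000)
k_best = min(ks, key=lambda k: residual(x, y, k))
A, B = fit_linearized(x, y, k_best)
a, b = np.exp(A), np.exp(B)
```

On this clean data the scan recovers parameters close to $k=2$, $a=3$, $b=1.5$; with noisy data the linearized fit only approximates the original objective, so a final nonlinear refinement may still help.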

ANSWER

The Gauss-Newton algorithm deals directly with this type of problem. Given $m$ data points $(x_i,y_i)$ and a model $f$ with $n$ parameters $\vec \beta =(\beta_1,\ldots,\beta_n)$, the task is

$$\min_{\vec \beta}\ S(\vec \beta),\quad\text{where}\quad S(\vec \beta)=\sum_{i=1}^m r_i(\vec \beta)^2,\qquad r_i(\vec \beta)=y_i-f(\vec \beta,x_i).$$

I skip the derivation of the algorithm, which you can find in any textbook (first use a Taylor approximation, then apply Newton's method). Each iteration computes

$$\Delta \vec \beta=\big(J^T J\big)^{-1} J^T\, \vec r,\qquad \vec \beta \leftarrow \vec \beta + \alpha\, \Delta \vec \beta,$$

where $\alpha$ is a damping coefficient and

$$J=\begin{pmatrix}\bigg(\dfrac{\partial f}{\partial \beta_1}\bigg)_{x=x_1}&\cdots&\bigg(\dfrac{\partial f}{\partial \beta_n}\bigg)_{x=x_1}\\ \vdots&\ddots&\vdots\\ \bigg(\dfrac{\partial f}{\partial \beta_1}\bigg)_{x=x_m}&\cdots&\bigg(\dfrac{\partial f}{\partial \beta_n}\bigg)_{x=x_m}\end{pmatrix},\qquad \vec r=\begin{pmatrix}y_1-f(\vec \beta,x_1)\\ \vdots \\ y_m-f(\vec \beta,x_m)\end{pmatrix}.$$

For your specific case, $f(x)=k+ab^x$, so

$$\frac{\partial f}{\partial a}=b^{x_i},\qquad \frac{\partial f}{\partial b}=a\,x_i\,b^{x_i-1},\qquad \frac{\partial f}{\partial k}=1.$$
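As a sketch (not part of the original answer), the update above fits in a few lines of NumPy. The synthetic data, the starting point $(a,b,k)=(2.5,\,1.4,\,1.0)$, the iteration count, and $\alpha=0.5$ are illustrative assumptions; a production routine would add a convergence test and safeguards against a poorly conditioned $J^T J$.

```python
import numpy as np

def gauss_newton(x, y, beta0, alpha=0.5, iters=300):
    """Damped Gauss-Newton for the model f(x) = k + a*b**x, beta = (a, b, k)."""
    a, b, k = beta0
    for _ in range(iters):
        r = y - (k + a * b**x)                     # residual vector
        J = np.column_stack([b**x,                 # df/da
                             a * x * b**(x - 1),   # df/db
                             np.ones_like(x)])     # df/dk
        # normal equations: (J^T J) delta = J^T r
        delta = np.linalg.solve(J.T @ J, J.T @ r)
        a, b, k = (a, b, k) + alpha * delta        # damped update
    return a, b, k

# synthetic data generated from y = 2 + 3 * 1.5**x (my own test case)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2 + 3 * 1.5**x
a, b, k = gauss_newton(x, y, beta0=(2.5, 1.4, 1.0))
```

From this nearby start the iteration recovers $(a,b,k)\approx(3,\,1.5,\,2)$; a poor initial guess can still diverge, which is where the damping coefficient $\alpha$ earns its keep.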

ANSWER

If you don't want to have to think too much, you can chuck your problem into some Gauss-Newton or Levenberg-Marquardt routine and be done with it. If you're willing to do a bit of analytical work, here's a way that might make for a less computationally intensive solution.

You can ease your problem by recognizing that your model is linear in the parameters $a$ and $k$. In particular, taking partial derivatives of your objective function with respect to those two parameters and equating to zero results in the following set of linear equations:

$$\begin{pmatrix}\sum_j^n b^{2x_j}&\sum_j^n b^{x_j}\\\sum_j^n b^{x_j}&n\end{pmatrix}\begin{pmatrix}a\\k\end{pmatrix}=\begin{pmatrix}\sum_j^n y_j b^{x_j}\\\sum_j^n y_j\end{pmatrix}$$

which is easily solved with Gaussian elimination or Cramer's rule.

Now, taking the partial derivative with respect to the nonlinear parameter $b$ and equating to $0$ gives

$$\sum_j^n \left(x_j b^{x_j}\left(a b^{x_j}+k-y_j\right)\right)=0$$

Substituting the values of $a$ and $k$ obtained from the linear equations now yields a nonlinear equation in the single unknown $b$, which can then be solved with standard methods such as Newton-Raphson or Brent's method. Of course, you need a good guess for $b$; that's the part of the problem you will need to reckon out for yourself. Once you've obtained a nice value for $b$, you can substitute it into your explicit expressions for $a$ and $k$.
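The whole scheme can be sketched in Python as follows. This is my own illustration, not code from the answer: the synthetic data (from $y=2+3\cdot 1.5^x$), the bracket $[1.1,\,2.0]$ for $b$, and the use of plain bisection in place of Newton-Raphson or Brent are all assumptions, and the bisection only works if the bracket actually contains a sign change of the derivative.

```python
import numpy as np

def solve_a_k(x, y, b):
    """Solve the 2x2 linear system for a and k at a fixed b."""
    bx = b**x
    M = np.array([[(bx**2).sum(), bx.sum()],
                  [bx.sum(),      float(len(x))]])
    rhs = np.array([(y * bx).sum(), y.sum()])
    a, k = np.linalg.solve(M, rhs)
    return a, k

def g(x, y, b):
    """The stationarity condition in b: sum_j x_j b^{x_j} (a b^{x_j} + k - y_j),
    with a and k eliminated via the linear solve."""
    a, k = solve_a_k(x, y, b)
    bx = b**x
    return (x * bx * (a * bx + k - y)).sum()

# synthetic data from y = 2 + 3 * 1.5**x, so the root should be b = 1.5
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2 + 3 * 1.5**x

# plain bisection on g; assumes g changes sign exactly once on [lo, hi]
lo, hi = 1.1, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if g(x, y, lo) * g(x, y, mid) <= 0:
        hi = mid
    else:
        lo = mid
b = 0.5 * (lo + hi)
a, k = solve_a_k(x, y, b)   # back-substitute for the linear parameters
```

Only the scalar function $g$ is iterated on, which is what makes this variable-projection-style approach cheaper than a full three-parameter nonlinear search.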