Given data points $(x, y)$, how do I find the values for $a$, $b$ and $c$ in $y = ab^x + c$?


If the data is assumed to follow the equation $y = ab^x$, I can just take $\log$ on both sides:

$$\log y = x \log b + \log a$$

and fit a linear regression, $\log y$ against $x$, where $\log b$ is the gradient and $\log a$ is the intercept.
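As a minimal sketch of this log-linear fit (the data here are assumed, generated from $a=2$, $b=1.5$):

```python
import numpy as np

# Assumed data following y = a * b**x exactly, with a = 2, b = 1.5.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * 1.5**x

# Linear regression of log(y) on x: slope = log(b), intercept = log(a).
slope, intercept = np.polyfit(x, np.log(y), 1)
a, b = np.exp(intercept), np.exp(slope)
print(a, b)  # recovers a ≈ 2, b ≈ 1.5
```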

However, for the case of $y = ab^x + c$, is there a closed-form solution or an iterative method for computing the values of $a, b, c$?


This simple problem is interesting because there are two different approaches.

Remember that what you want is to minimize $$\text{SSQ}=\sum_{i=1}^n \big(a\,b^{x_i}+c-y_i\big)^2$$ and nothing else since what is measured is $y$ and not any of its possible transforms.

  • First approach

As @Doug suggested, give $c$ a value and linearize the model as $$z=\alpha + \beta x$$ where $$z_i=\log(y_i-c) \qquad \alpha=\log(a) \qquad \beta=\log(b)$$ The linear regression gives immediately $(\alpha , \beta)$ which are implicit functions of $c$.

Now, $\color{red}{\text{recompute}}$ $$y_i^{\text{calc}}=c+e^{\alpha + \beta x_i}$$ and the corresponding value of $\text{SSQ}$. Vary $c$ until you find the minimum; if you cannot use nonlinear regression tools, repeatedly narrow (zoom in on) the search interval around the best $c$.
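A minimal sketch of this scan over $c$ (the data are assumed, generated from $a=2$, $b=1.5$, $c=3$; the grid and its bounds are illustrative choices):

```python
import numpy as np

# Assumed synthetic data from y = a*b**x + c with a = 2, b = 1.5, c = 3.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = 2.0 * 1.5**x + 3.0

def ssq_given_c(c):
    # Linearize: log(y - c) = alpha + beta*x, fit by linear regression.
    z = np.log(y - c)
    beta, alpha = np.polyfit(x, z, 1)
    # Recompute y on the original scale and the corresponding SSQ.
    y_calc = c + np.exp(alpha + beta * x)
    return np.sum((y_calc - y) ** 2)

# Scan c over a grid (c must satisfy c < min(y) so the log is defined),
# then zoom in around the minimizer as needed.
cs = np.linspace(0.0, 4.9, 491)
best_c = min(cs, key=ssq_given_c)
```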

  • Second approach

The model being nonlinear because of $b$, give it a value and write the model as $$y=a t+c \qquad \text{where} \qquad t_i=b^{x_i}$$ Here again, the linear regression gives immediately $(a,c)$, which are implicit functions of $b$. Proceed as before, zooming in more and more.

In my humble opinion, the advantage of this approach is that it requires less work. Using the normal equations, we can solve explicitly for $(a,c)$: $$\sum_{i=1}^n y_i=a\sum_{i=1}^n t_i+n\,c$$ $$\sum_{i=1}^n t_i\,y_i=a\sum_{i=1}^n t_i^2+c \sum_{i=1}^n t_i$$
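A sketch of solving this $2\times 2$ system for $(a,c)$ at a fixed $b$ (the data are assumed, generated from $a=2$, $b=1.5$, $c=3$; the helper name `a_c_given_b` is illustrative):

```python
import numpy as np

# Assumed synthetic data from y = 2 * 1.5**x + 3.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * 1.5**x + 3.0
n = len(x)

def a_c_given_b(b):
    # Solve the two normal equations for (a, c) at fixed b:
    #   sum(y)   = a*sum(t)    + n*c
    #   sum(t*y) = a*sum(t**2) + c*sum(t)
    t = b**x
    A = np.array([[t.sum(), n],
                  [(t**2).sum(), t.sum()]])
    rhs = np.array([y.sum(), (t * y).sum()])
    a, c = np.linalg.solve(A, rhs)
    return a, c

a, c = a_c_given_b(1.5)  # at the true b, the fit is exact
```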

Now, we need to solve for $b$ the equation $$\frac{\partial \text{SSQ}}{\partial b}=0$$ that is, $$\sum _{i=1}^n x_i\, b^{x_i} \big(a(b)\,\, b^{x_i}+c(b)-y_i\big)=0$$ This can easily be solved using Newton's method with numerical derivatives.
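A minimal sketch of that Newton iteration (data again assumed, generated from $a=2$, $b=1.5$, $c=3$; the starting guess and step tolerance are illustrative choices):

```python
import numpy as np

# Assumed synthetic data from y = 2 * 1.5**x + 3.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * 1.5**x + 3.0

def a_c_given_b(b):
    # For fixed b, (a, c) solve a linear least-squares problem in t = b**x.
    t = b**x
    A = np.column_stack([t, np.ones_like(t)])
    (a, c), *_ = np.linalg.lstsq(A, y, rcond=None)
    return a, c

def g(b):
    # g(b) = sum_i x_i b^{x_i} (a(b) b^{x_i} + c(b) - y_i); we seek g(b) = 0.
    a, c = a_c_given_b(b)
    t = b**x
    return np.sum(x * t * (a * t + c - y))

# Newton's method on g with a central-difference numerical derivative,
# started from a rough guess (e.g. obtained by a coarse scan over b).
b, h = 1.6, 1e-6
for _ in range(50):
    step = g(b) * 2 * h / (g(b + h) - g(b - h))
    b -= step
    if abs(step) < 1e-12:
        break
```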