How to find more than two coefficients for a single-variable nonlinear equation?


I don't have a strong background in mathematics, but I am now facing a math problem. I have a data set containing one independent and one dependent variable, and an equation with 3 unknown parameters.

Here I have two questions:

  1. If I have only one independent variable, how do I find the 3 unknown parameters?
  2. If I have more than one independent variable, how do I find the corresponding parameters?

http://www.stat.colostate.edu/regression_book/chapter9.pdf

In this link, I am struggling with equation 9.2.3.

Here unknown parameters are $\beta_1$, $\beta_2$ and $\beta_3$.

Please help me find these three unknown parameters, in the single-variable as well as the multiple-variable case.

Thanks in advance.

2 Answers

Accepted answer:

For the first question, the model is $$y=\frac{1}{a+b~x^c}$$ which is nonlinear with respect to all parameters. I suppose that you have $n$ data points $(x_i,y_i)$ to which you want to fit the parameters $a,b,c$; as usual, the difficulty with nonlinear regression is finding reasonable starting values.

For the time being, work with the transformed values $z_i=\frac 1 {y_i}$, which turn the model into $$z={a+b~x^c}$$ which is much more tractable. Suppose that $c$ is fixed; then a linear regression $z=a+b~t$ (with $t_i=x_i^c$) immediately gives the parameters $a$ and $b$, as well as the corresponding sum of squares $SSQ$. The sum of squares is thus a function of $c$ alone, and you can search graphically for an approximate minimum: try a few discrete values of $c$ and locate the minimum approximately. For that value of $c$, you also have the corresponding $a$ and $b$.

Now you are ready to start the nonlinear regression for the original model.
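The procedure above can be sketched in a few lines of numpy. Everything here is illustrative: the data are generated from the model with assumed true values $a=2$, $b=0.5$, $c=1.5$, and the search grid for $c$ is a guess you would adjust for your own data.

```python
import numpy as np

# Illustrative data: generated from y = 1/(a + b*x^c)
# with assumed true values a=2.0, b=0.5, c=1.5 (no noise, for clarity).
a_true, b_true, c_true = 2.0, 0.5, 1.5
x = np.linspace(0.5, 5.0, 40)
y = 1.0 / (a_true + b_true * x**c_true)

z = 1.0 / y  # transformed response: z = a + b*x^c, linear in a and b once c is fixed

def fit_for_c(c):
    """Ordinary least squares for z = a + b*t with t = x^c; returns (a, b, SSQ)."""
    t = x**c
    A = np.column_stack([np.ones_like(t), t])  # columns: intercept, x^c
    (a, b), *_ = np.linalg.lstsq(A, z, rcond=None)
    ssq = float(np.sum((z - (a + b * t)) ** 2))
    return a, b, ssq

# Scan a discrete grid of c values and keep the one with the smallest SSQ.
grid = np.linspace(0.5, 2.5, 41)  # step 0.05; the range is purely an assumption
best_c = min(grid, key=lambda c: fit_for_c(c)[2])
a0, b0, _ = fit_for_c(best_c)
print(a0, b0, best_c)  # starting values for the full nonlinear fit
```

The triple $(a_0, b_0, c_{\text{best}})$ then serves as the starting point for a nonlinear least-squares routine (e.g. `scipy.optimize.curve_fit`) applied to the original model $y=1/(a+b\,x^c)$.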

Second answer:

Your equation (9.2.3) isn't meant to be "solved". It is simply an example of a model relating crop yield $Y$ to time $X$.

The regression method, which hasn't been discussed yet at that point in your text, attempts to find values for the parameters $\beta_i$ which minimize the error over the entire set of sample points $(x_k,y_k)$. It does not attempt to find parameters such that $y_k= \mu_Y(x_k)$ exactly for all pairs $(x_k,y_k)$ -- this would typically be impossible, since the number of sample points is usually much larger than the number of parameters. Instead, regression finds the best possible fit to the data, in the sense of choosing the parameters that minimize the error.

To find out how to do this, continue reading your text.

The multivariable case isn't any different in theory.