least-squares estimation


I want to write a computer program that finds the least-squares estimates of the coefficients in the following models:

1) $y = ax^2+bx+c$

2) $y= ax^n$

Can you help me with what I should do? I don't have any data for these models.


(1) is linear in the parameters ($a, b, c$). If your data points are $(x_i, y_i)_{i=1}^m$, what you want to do is minimize the sum $S =\sum_{i=1}^m (y_i-(ax_i^2+bx_i+c))^2 $.

For each parameter (for example, $a$), differentiate $S$ with respect to that parameter and set the result to zero. This yields one equation per parameter.

In the example,

$$\begin{aligned} \frac{\partial S}{\partial a} &=\sum_{i=1}^m \frac{\partial }{\partial a}\left(y_i-(ax_i^2+bx_i+c)\right)^2\\ &=\sum_{i=1}^m 2\left(y_i-(ax_i^2+bx_i+c)\right)\frac{\partial }{\partial a}\left(y_i-(ax_i^2+bx_i+c)\right)\\ &=\sum_{i=1}^m 2\left(y_i-(ax_i^2+bx_i+c)\right)(-x_i^2)\\ &=-2\sum_{i=1}^m x_i^2\left(y_i-(ax_i^2+bx_i+c)\right)\\ &=-2\left(\sum_{i=1}^m x_i^2y_i-\sum_{i=1}^m x_i^2ax_i^2-\sum_{i=1}^m x_i^2bx_i-\sum_{i=1}^m x_i^2c\right)\\ &=-2\left(\sum_{i=1}^m x_i^2y_i-a\sum_{i=1}^m x_i^4-b\sum_{i=1}^m x_i^3-c\sum_{i=1}^m x_i^2\right) \end{aligned}$$

Setting this partial derivative to zero, we get $$\sum_{i=1}^m x_i^2y_i =a\sum_{i=1}^m x_i^4+b\sum_{i=1}^m x_i^3+c\sum_{i=1}^m x_i^2. $$

This is one of the three equations in $a, b, c$ that are needed.

Do the same with $\frac{\partial S}{\partial b} = 0 $ and $\frac{\partial S}{\partial c} = 0 $.
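Carrying out those two differentiations the same way gives the remaining normal equations:
$$\sum_{i=1}^m x_iy_i = a\sum_{i=1}^m x_i^3+b\sum_{i=1}^m x_i^2+c\sum_{i=1}^m x_i,$$
$$\sum_{i=1}^m y_i = a\sum_{i=1}^m x_i^2+b\sum_{i=1}^m x_i+cm.$$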

Then solve these three linear equations for $a, b, $ and $c$.

This is the least squares method.
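As a sketch of how this looks in code, here is the normal-equations approach in Python with NumPy. The data points are made up purely for illustration, since you said you have no data yet:

```python
import numpy as np

# Invented example data for y ≈ a*x^2 + b*x + c.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 18.8, 33.1])

# Power sums appearing in the three normal equations.
S4, S3, S2, S1, S0 = (x**4).sum(), (x**3).sum(), (x**2).sum(), x.sum(), len(x)
# Right-hand sides: sum x_i^2 y_i, sum x_i y_i, sum y_i.
T2, T1, T0 = (x**2 * y).sum(), (x * y).sum(), y.sum()

# The 3x3 linear system from setting the three partial derivatives to zero.
A = np.array([[S4, S3, S2],
              [S3, S2, S1],
              [S2, S1, S0]])
t = np.array([T2, T1, T0])

a, b, c = np.linalg.solve(A, t)  # least-squares coefficients
```

In practice you would more likely call `np.polyfit(x, y, 2)` directly, which solves the same least-squares problem, but building the normal equations by hand matches the derivation above.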

(2) is linear in $a$ but nonlinear in $n$. You can make it linear in its parameters by writing $\ln(y) = \ln(a)+n \ln(x) $. Then the parameters are $\ln(a)$ and $n$.
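A minimal sketch of that log-transform fit, again with invented data (note that $x$ and $y$ must be positive for the logarithms to make sense):

```python
import numpy as np

# Exact power-law data y = 2.5 * x^1.7, so the fit should recover a and n.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.5 * x**1.7

# Straight-line least squares on (ln x, ln y): slope = n, intercept = ln a.
n, log_a = np.polyfit(np.log(x), np.log(y), 1)
a = np.exp(log_a)
```

One caveat: this minimizes the squared errors in $\ln y$, not in $y$ itself, so it is not quite the same estimate as nonlinear least squares on the original model, though it is often a good starting point.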

Anyway, this is a start.

Linear least squares, as in problem (1), is a lot easier than nonlinear least squares, as in problem (2).