Least squares method to fit data to $g(x) = a - x + bx^2$, where $a, b \in \mathbb{R}$


I have a set of points $(x_i, f(x_i)) \in \mathbb{R}^2$, and I want to use the method of least squares to calculate the free parameters $a$ and $b$ of the function $$ g(x) = a - x + bx^2. $$

Looking at the function $g$, one possible basis of functions would be $$ \phi_1(x) = 1, \qquad \phi_2(x) = x^2, $$ and the fixed term $-x$ of $g$ would cause a translation of the linear space of solutions. Is this correct?

I don't know how to proceed with this exercise. Could anyone give me a hint? Thanks in advance.


Best answer:

Define $$h(x)=g(x)+x,$$ and first fit the function $$h(x)=a+bx^2$$ for $a,b$. Linear least-squares regression then gives the approximate $a$ and $b$: $$\left(\begin{matrix} a \\ b \end{matrix}\right) \simeq \left(\begin{matrix} n & \sum_{i=1}^{n} x_i^2 \\ \sum_{i=1}^{n} x_i^2 & \sum_{i=1}^{n} x_i^4 \end{matrix}\right)^{-1} \left(\begin{matrix} \sum_{i=1}^{n} h_i \\ \sum_{i=1}^{n} h_ix_i^2 \end{matrix}\right)\quad \text{with}\quad h_i=f(x_i)+x_i.$$
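As a sketch, the matrix normal equations above can be solved numerically with NumPy. The data here are illustrative synthetic values (not from the question), generated from $g$ with assumed parameters $a = 2$, $b = 0.5$:

```python
import numpy as np

# Illustrative synthetic data from g(x) = a - x + b x^2
# with assumed true parameters a = 2.0, b = 0.5 (not from the post).
x = np.linspace(-3.0, 3.0, 50)
a_true, b_true = 2.0, 0.5
f = a_true - x + b_true * x**2

# Shift the data: h_i = f(x_i) + x_i, so the model h(x) = a + b x^2
# is linear in the unknowns (a, b).
h = f + x

# Build and solve the 2x2 normal-equation system from the answer.
n = len(x)
M = np.array([[n,             np.sum(x**2)],
              [np.sum(x**2),  np.sum(x**4)]])
rhs = np.array([np.sum(h), np.sum(h * x**2)])
a_fit, b_fit = np.linalg.solve(M, rhs)

print(a_fit, b_fit)
```

With noise-free data the fit recovers the parameters to machine precision; with noisy data it returns the least-squares estimates.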


You have $n$ data points $(x_i,y_i)$ and you want to minimize $$\text{SSQ}=\sum_{i=1}^n \big(a-x_i+b x_i^2-y_i\big)^2.$$

Start by defining two new variables $$z_i=y_i+x_i\qquad \text{and} \qquad t_i=x_i^2,$$ which make $$\text{SSQ}=\sum_{i=1}^n \big(a+b\, t_i-z_i\big)^2.$$

Setting the partial derivatives $\frac{\partial \text{ SSQ}}{\partial a}$ and $\frac{\partial \text{ SSQ}}{\partial b}$ equal to $0$, we have to solve the normal equations $$n \,a +b \sum_{i=1}^n t_i=\sum_{i=1}^n z_i$$ $$a\sum_{i=1}^n t_i+b\sum_{i=1}^n t_i^2=\sum_{i=1}^n t_iz_i,$$ which I shall rewrite as $$n a +b S_t=S_z$$ $$a S_t+b S_{tt}=S_{tz},$$ and the exact solutions are $$a=\frac{S_{tt}\, S_z-S_t \,S_{tz}}{n \,S_{tt}-S_t^2}\qquad \text{and} \qquad b=\frac{n\, S_{tz}-S_t\, S_z}{n \,S_{tt}-S_t^2}.$$
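The closed-form solution above can be sketched in code. The data are again illustrative synthetic values with assumed parameters $a = -1$, $b = 3$ (not from the post):

```python
import numpy as np

# Illustrative synthetic data from g(x) = a - x + b x^2
# with assumed true parameters a = -1.0, b = 3.0 (not from the post).
x = np.linspace(0.0, 2.0, 20)
a_true, b_true = -1.0, 3.0
y = a_true - x + b_true * x**2

# The substitution from the answer: z_i = y_i + x_i, t_i = x_i^2.
z = y + x
t = x**2

# The sums S_t, S_z, S_tt, S_tz and the exact closed-form solution.
n = len(x)
S_t, S_z = t.sum(), z.sum()
S_tt, S_tz = (t**2).sum(), (t * z).sum()
denom = n * S_tt - S_t**2

a_fit = (S_tt * S_z - S_t * S_tz) / denom
b_fit = (n * S_tz - S_t * S_z) / denom

print(a_fit, b_fit)
```

The denominator $n\,S_{tt} - S_t^2$ is positive whenever the $t_i$ are not all equal (Cauchy-Schwarz), so the solution is well defined for any data with at least two distinct values of $x_i^2$.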