Confused trying to follow least-squares polynomial fitting


I am trying to follow the steps here, but I am lost on how to proceed from step 9 to actually find the values of $a_0, a_1, a_2$. I don't see what step 10 has to do with step 9.



Best answer:

Equation (10) doesn't follow from (9). Notice that (9) and (12) are identical. The article is just pointing out that you can differentiate the residual with respect to the coefficients of the interpolating polynomial to obtain (9)/(12); or you can use the Vandermonde matrix equation (10) to explicitly derive (9)/(12).
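To make that equivalence concrete: here is a minimal sketch in Python/NumPy (using the data from the worked example below) that builds the Vandermonde matrix of (10) and solves the normal equations $X^{T}X\mathbf{a} = X^{T}\mathbf{y}$, i.e. (9)/(12), directly:

```python
import numpy as np

# Data from the worked example below
x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([-2.0, -2.0, 1.0, 3.0])

# Vandermonde matrix for a quadratic fit; columns are 1, x, x^2
X = np.vander(x, 3, increasing=True)

# Solving the normal equations X^T X a = X^T y gives the coefficients
a = np.linalg.solve(X.T @ X, X.T @ y)
print(a)  # a0, a1, a2
```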

Edit: Suppose we're given the data

\begin{align*} (x_{1},y_{1}) &= (-2,-2)\\ (x_{2},y_{2}) &= (-1,-2)\\ (x_{3},y_{3}) &= (1,1)\\ (x_{4},y_{4}) &= (2,3) \end{align*}

and suppose we want to find the best quadratic polynomial fit to these points in the $2$-norm sense, i.e.,

\begin{equation*} y = a_{0} + a_{1}x + a_{2}x^{2}. \end{equation*}

Then we have

\begin{equation*} X = \left(\begin{array}{ccc} 1 & x_{1} & x_{1}^{2}\\ 1 & x_{2} & x_{2}^{2}\\ 1 & x_{3} & x_{3}^{2}\\ 1 & x_{4} & x_{4}^{2} \end{array}\right) = \left(\begin{array}{ccc} 1 & -2 & 4\\ 1 & -1 & 1\\ 1 & 1 & 1\\ 1 & 2 & 4 \end{array}\right) \end{equation*}

and

\begin{equation*} \mathbf{y} = \left(\begin{array}{c} y_{1}\\ y_{2}\\ y_{3}\\ y_{4} \end{array}\right) = \left(\begin{array}{c} -2\\ -2\\ 1\\ 3 \end{array}\right). \end{equation*}

Then we have

\begin{equation*} X^{T}X = \left(\begin{array}{ccc} 4 & 0 & 10\\ 0 & 10 & 0\\ 10 & 0 & 34 \end{array}\right) \end{equation*}
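As a quick numerical check of this Gram matrix (a sketch, reusing the data above; its entries are just the sums $\sum 1$, $\sum x_i$, $\sum x_i^2$, etc.):

```python
import numpy as np

x = np.array([-2.0, -1.0, 1.0, 2.0])
X = np.vander(x, 3, increasing=True)  # columns: 1, x, x^2

# Entries are sums of powers of x: [[4, 0, 10], [0, 10, 0], [10, 0, 34]]
G = X.T @ X
print(G)
```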

so

\begin{equation*} \left(X^{T}X\right)^{-1} = \left(\begin{array}{ccc} 17/18 & 0 & -5/18\\ 0 & 1/10 & 0\\ -5/18 & 0 & 1/9 \end{array}\right). \end{equation*}

Next, we compute

\begin{equation*} \left(X^{T}X\right)^{-1}X^{T} = \left(\begin{array}{ccc} 17/18 & 0 & -5/18\\ 0 & 1/10 & 0\\ -5/18 & 0 & 1/9 \end{array}\right)\left(\begin{array}{cccc} 1 & 1 & 1 & 1\\ -2 & -1 & 1 & 2\\ 4 & 1 & 1 & 4 \end{array}\right) = \left(\begin{array}{cccc} -1/6 & 2/3 & 2/3 & -1/6\\ -1/5 & -1/10 & 1/10 & 1/5\\ 1/6 & -1/6 & -1/6 & 1/6 \end{array}\right) \end{equation*}
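Since $X$ has full column rank, this matrix $\left(X^{T}X\right)^{-1}X^{T}$ is exactly the Moore-Penrose pseudoinverse of $X$, so it can be checked in one call (a sketch, assuming NumPy):

```python
import numpy as np

x = np.array([-2.0, -1.0, 1.0, 2.0])
X = np.vander(x, 3, increasing=True)  # columns: 1, x, x^2

# For full-column-rank X, pinv(X) equals (X^T X)^{-1} X^T
P = np.linalg.pinv(X)
print(P)
```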

and finally,

\begin{equation*} \mathbf{a} = \left(\begin{array}{c} a_{0}\\ a_{1}\\ a_{2} \end{array}\right) = \left(X^{T}X\right)^{-1}X^{T}\mathbf{y} = \left(\begin{array}{cccc} -1/6 & 2/3 & 2/3 & -1/6\\ -1/5 & -1/10 & 1/10 & 1/5\\ 1/6 & -1/6 & -1/6 & 1/6 \end{array}\right)\left(\begin{array}{c} -2\\ -2\\ 1\\ 3 \end{array}\right) = \left(\begin{array}{c} -5/6\\ 13/10\\ 1/3 \end{array}\right). \end{equation*}
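In practice one would not invert $X^{T}X$ by hand; a dedicated least-squares solver such as `np.linalg.lstsq` produces the same coefficients and avoids forming $X^{T}X$ explicitly, which is better conditioned for larger problems (a sketch, assuming NumPy):

```python
import numpy as np

x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([-2.0, -2.0, 1.0, 3.0])
X = np.vander(x, 3, increasing=True)  # columns: 1, x, x^2

# lstsq solves min ||X a - y||_2 directly
a, *_ = np.linalg.lstsq(X, y, rcond=None)
print(a)  # approximately [-5/6, 13/10, 1/3]
```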

Then the least-squares quadratic polynomial fit to the data is

\begin{equation*} y = -\frac{5}{6} + \frac{13}{10}x + \frac{1}{3}x^{2}. \end{equation*}
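As a final cross-check, `np.polyfit` fits the same quadratic in one line (note that it returns coefficients highest degree first, so the order is reversed here):

```python
import numpy as np

x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([-2.0, -2.0, 1.0, 3.0])

# polyfit returns [a2, a1, a0]; reverse to compare with (a0, a1, a2)
coeffs = np.polyfit(x, y, 2)
print(coeffs[::-1])  # approximately [-5/6, 13/10, 1/3]
```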

Here is a plot of the data along with the quadratic least-squares fit:

[Plot: the four data points with the quadratic least-squares fit overlaid]

It's pretty close! That's the beauty of the least-squares approximation. ;)