I have an exam coming up next week in my Applied Numerical Methods class. Our professor gave us a list of about 12 things that we need to be able to do for the exam, all of which are pretty straightforward. However, one of the subjects I had no recollection of learning. Since I missed a day of class since the last test, I emailed the professor, and long story short, we did not cover this in class; she told us to research it online.
Here is what the study guide says:
Write the system of equations to determine a set of curve fitting parameters (A, B, C, etc) for linear and quadratic least squares, and for simple nonlinear forms y = f(x; A, B). Be able to solve a 2 × 2 linear system.
All I can find online are videos of people doing curve fitting in MATLAB. I honestly don't know where to start here. Can someone please explain what I should be researching here in layman's terms? Could I get an example of a 2x2 linear system?
I am not sure how much this can help, but let me share some ideas of curve fitting here with you.
Assume we have a set of $n+1$ data points with abscissae $(t_i)_{i=0}^n$ and values $(y_i)_{i=0}^n$. Certainly we can find a polynomial of degree at most $n$ passing through all the points $(t_i, y_i)$ by means of interpolation, but the resulting polynomial may not be satisfactory. One reason is that the polynomial can be highly oscillatory (it may have up to $n-1$ turning points). That is why we do curve fitting: we give up reproducing the data exactly in order to find a smoother polynomial of lower degree that represents the data.
The least-squares method is a typical way to find such a polynomial of degree $m < n$. For a polynomial $p(t)$ of degree $m$, we define the error by $E[p] = \sum_{i=0}^n |y_i - p(t_i)|^2$. We remark that the error depends on the polynomial $p(t)$. The best-fit polynomial $p_0(t)$ is the one which minimizes the error. Equivalently, we find the coefficients $\alpha_0, \alpha_1, \ldots, \alpha_m$ of the best-fit polynomial $p_0(t) = \sum_{i=0}^m \alpha_i t^i$.
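To make the error concrete, here is a minimal sketch that evaluates $E[p]$ for a candidate polynomial; the data points and the candidate coefficients are made-up example values, not anything from the problem:

```python
# Made-up example data (t_i, y_i), i = 0..n
t = [0.0, 1.0, 2.0, 3.0]
y = [1.1, 1.9, 3.2, 3.8]

def poly(coeffs, x):
    """Evaluate p(x) = sum of alpha_i * x**i for coefficients alpha_0..alpha_m."""
    return sum(a * x**i for i, a in enumerate(coeffs))

def error(coeffs, t, y):
    """E[p] = sum over i of |y_i - p(t_i)|^2 (the least-squares error)."""
    return sum((yi - poly(coeffs, ti))**2 for ti, yi in zip(t, y))

# Error of the (arbitrarily chosen) candidate line p(t) = 1 + t
print(error([1.0, 1.0], t, y))
```

Different choices of coefficients give different errors; least squares picks the coefficients that make this number as small as possible.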
For example, to find a linear fitting curve $p(t) = a + b t$, we must find the two coefficients $a$ and $b$. The error is $E[p] = \sum_{i=0}^n |y_i - (a + b t_i)|^2$. To minimize the error, we find the minimizer of $E[p]$ with respect to $a$ and $b$. Hence we are to find $(\alpha, \beta)$ s.t. $$ \frac{\partial E[p]}{\partial a} \Big|_{(a, b) = (\alpha, \beta)} = 0, \qquad \frac{\partial E[p]}{\partial b} \Big|_{(a, b) = (\alpha, \beta)} = 0. $$ Carrying out the differentiation gives the $2 \times 2$ linear system (the normal equations) $$ \begin{pmatrix} n+1 & \sum_{i=0}^n t_i \\ \sum_{i=0}^n t_i & \sum_{i=0}^n t_i^2 \end{pmatrix} \begin{pmatrix} \alpha \\ \beta \end{pmatrix} = \begin{pmatrix} \sum_{i=0}^n y_i \\ \sum_{i=0}^n t_i y_i \end{pmatrix}, $$ and the least-squares best-fit straight line is $\tilde{y}(t) = \alpha + \beta t$.
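As a sketch of how that $2 \times 2$ system is built and solved in practice, here is a small example with made-up data, solving the system by Cramer's rule:

```python
# Made-up example data (t_i, y_i), i = 0..n
t = [0.0, 1.0, 2.0, 3.0]
y = [1.1, 1.9, 3.2, 3.8]

n1 = len(t)                               # number of points, n + 1
St = sum(t)                               # sum of t_i
Stt = sum(ti**2 for ti in t)              # sum of t_i^2
Sy = sum(y)                               # sum of y_i
Sty = sum(ti * yi for ti, yi in zip(t, y))  # sum of t_i * y_i

# Normal equations for the best-fit line y ~ alpha + beta * t:
#   [ n1   St  ] [alpha]   [ Sy  ]
#   [ St   Stt ] [beta ] = [ Sty ]
# Solve the 2x2 system by Cramer's rule.
det = n1 * Stt - St * St
alpha = (Sy * Stt - St * Sty) / det
beta = (n1 * Sty - St * Sy) / det

print(alpha, beta)  # intercept and slope of the best-fit line
```

For an exam you would do exactly this by hand: compute the four sums from the data, write down the $2 \times 2$ system, and solve it by substitution, elimination, or Cramer's rule.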
In general, we could try to determine the $m+1$ coefficients from the $n+1$ equations $$p_0(t_j) = \sum_{i=0}^m \alpha_i t_j^i = y_j, \qquad j = 0, \ldots, n.$$ But under the assumption $m < n$, this Vandermonde system is overdetermined and in general inconsistent. We can find the least-squares solution by the pseudoinverse (equivalently, by the normal equations). Note that if $m = n$, the Vandermonde system is consistent and the solution is just the usual interpolating polynomial.
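A minimal sketch of this overdetermined-system view, fitting a quadratic ($m = 2$) to five made-up data points using NumPy's least-squares solver (which handles the pseudoinverse step for you):

```python
import numpy as np

# Made-up example data: n + 1 = 5 points, fitting degree m = 2 < n = 4
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 0.2, 0.1, 0.9, 3.1])

m = 2
# Vandermonde matrix with columns t^0, t^1, t^2 (one row per data point)
V = np.vander(t, m + 1, increasing=True)

# Least-squares solution of the overdetermined system V @ coeffs = y
coeffs, residuals, rank, sv = np.linalg.lstsq(V, y, rcond=None)
print(coeffs)  # alpha_0, alpha_1, alpha_2 of the best-fit quadratic
```

The same call works for the linear case ($m = 1$), and the `coeffs` it returns agree with what you get by solving the normal equations by hand.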