Let $x=(x_1,\dots,x_n)$ and $y=(y_1,\dots,y_n)$. Write code that finds $a,b,c$ solving $y_i=ax_i^2+bx_i+c$ in the least-squares sense, subject to the constraint $\int_{0}^1(ax^2+bx+c)\,dx=0$.
So we need to build $Ax=b$ and then use $x=\operatorname{pinv}(A)\,b$.
$\int_{0}^1(ax^2+bx+c)dx=\frac{a}{3}+\frac{b}{2}+c=0$
So $c=-\frac{a}{3}-\frac{b}{2}$
$\begin{pmatrix} x_1^2 & x_1 & 1 \\ \vdots & \vdots & \vdots\\ x_n^2 & x_n & 1 \end{pmatrix}\begin{pmatrix} a \\ b \\ -\frac{a}{3}-\frac{b}{2} \end{pmatrix}=b$
what should I put in $b$?
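One way to set this up in NumPy (a sketch, not a definitive implementation — the sample data arrays are hypothetical): substitute $c=-\frac{a}{3}-\frac{b}{2}$ into the model so each equation becomes $y_i = a(x_i^2-\tfrac13)+b(x_i-\tfrac12)$, which is an ordinary least-squares problem in the two unknowns $(a,b)$ with the data vector $y$ on the right-hand side.

```python
import numpy as np

# Hypothetical sample data: any equal-length x, y arrays work here.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 20)
y = 2 * x**2 - x + 0.1 * rng.standard_normal(20)

# Substituting c = -a/3 - b/2 into y_i = a x_i^2 + b x_i + c gives
#   y_i = a (x_i^2 - 1/3) + b (x_i - 1/2),
# a plain least-squares problem in the two unknowns (a, b).
A = np.column_stack([x**2 - 1/3, x - 1/2])
a, b = np.linalg.pinv(A) @ y        # or np.linalg.lstsq(A, y, rcond=None)[0]
c = -a/3 - b/2

# The constraint integral a/3 + b/2 + c is zero by construction.
print(a, b, c, a/3 + b/2 + c)
```

Eliminating $c$ up front keeps the problem unconstrained, so a plain pseudoinverse suffices and the constraint holds exactly by construction.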
How to do linear least squares fitting.
To fit a linear sum of $m$ functions $f_k(x), k=1$ to $m$ to $n$ points $(x_i, y_i), i=1$ to $n$, we want to find the $a_k, k=1$ to $m$ so that $\sum_{k=1}^m a_kf_k(x) $ best fits the data.
Let $S =\sum_{i=1}^n(y_i-\sum_{k=1}^m a_kf_k(x_i))^2$.
$\begin{array}\\ \dfrac{\partial S}{\partial a_j} &=D_jS\\ &=D_j\sum_{i=1}^n(y_i-\sum_{k=1}^m a_kf_k(x_i))^2\\ &=\sum_{i=1}^nD_j(y_i-\sum_{k=1}^m a_kf_k(x_i))^2\\ &=\sum_{i=1}^n2(y_i-\sum_{k=1}^m a_kf_k(x_i))D_j(y_i-\sum_{k=1}^m a_kf_k(x_i))\\ &=\sum_{i=1}^n2(y_i-\sum_{k=1}^m a_kf_k(x_i))(-D_j a_jf_j(x_i))\\ &=\sum_{i=1}^n2(y_i-\sum_{k=1}^m a_kf_k(x_i))(- f_j(x_i))\\ &=-2\sum_{i=1}^nf_j(x_i)(y_i-\sum_{k=1}^m a_kf_k(x_i))\\ &=-2\left(\sum_{i=1}^ny_if_j(x_i)-\sum_{i=1}^nf_j(x_i)\sum_{k=1}^m a_kf_k(x_i)\right)\\ &=-2\left(\sum_{i=1}^ny_if_j(x_i)-\sum_{k=1}^m a_k\sum_{i=1}^nf_j(x_i)f_k(x_i)\right)\\ \end{array} $
Therefore, if $D_jS = 0$, then $\sum_{i=1}^ny_if_j(x_i) =\sum_{k=1}^m a_k\sum_{i=1}^nf_j(x_i)f_k(x_i) $.
Doing this for $j=1$ to $m$ gives $m$ equations in the $m$ unknowns $a_1, ..., a_m$.
Example: To fit a polynomial of degree $m-1$, let $f_j(x) = x^{j-1}$. The equations are then
$\begin{array}\\ \sum_{i=1}^ny_ix_i^{j-1} &=\sum_{k=1}^m a_k\sum_{i=1}^nx_i^{j-1}x_i^{k-1}\\ &=\sum_{k=1}^m a_k\sum_{i=1}^nx_i^{k+j-2}\\ \end{array} $
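The normal equations above can be assembled and solved directly; here is a minimal sketch (the helper name and test data are my own, not from the answer):

```python
import numpy as np

def polyfit_normal_equations(x, y, degree):
    """Fit a polynomial of the given degree by solving the normal equations
    sum_i y_i x_i^{j-1} = sum_k a_k sum_i x_i^{k+j-2},  j = 1..m."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    m = degree + 1
    # With 0-based indices: M[j, k] = sum_i x_i^{j+k}, rhs[j] = sum_i y_i x_i^j
    M = np.array([[np.sum(x**(j + k)) for k in range(m)] for j in range(m)])
    rhs = np.array([np.sum(y * x**j) for j in range(m)])
    return np.linalg.solve(M, rhs)   # coefficients in ascending powers

# Hypothetical example: exact quadratic data is recovered (up to round-off).
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
coeffs = polyfit_normal_equations(x, 1 + 2*x - 3*x**2, degree=2)
print(coeffs)   # close to [1, 2, -3]
```

Note that the normal-equations matrix becomes ill-conditioned for high degrees; for serious use a QR-based solver such as `np.linalg.lstsq` on the Vandermonde matrix is preferable.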
For a line, $m=2$ and the equations are, for $j = 1, 2$,
$\begin{array}\\ \sum_{i=1}^ny_ix_i^{j-1} &=\sum_{k=1}^2 a_k\sum_{i=1}^nx_i^{k+j-2}\\ &= a_1\sum_{i=1}^nx_i^{j-1}+a_2\sum_{i=1}^nx_i^{j}\\ \end{array} $
Explicitly these are
$j=1:\sum_{i=1}^ny_i = a_1n+a_2\sum_{i=1}^nx_i\\ j=2:\sum_{i=1}^nx_iy_i = a_1\sum_{i=1}^nx_i+a_2\sum_{i=1}^nx_i^{2}\\ $
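These two equations can be solved in closed form by Cramer's rule; a short sketch with hypothetical data:

```python
import numpy as np

# The two normal equations for a line y = a1 + a2*x, solved in closed form.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 5.0, 7.0, 9.0])     # exactly y = 1 + 2x

n = len(x)
Sx, Sy, Sxx, Sxy = x.sum(), y.sum(), (x * x).sum(), (x * y).sum()
# Solve  Sy  = a1*n  + a2*Sx
#        Sxy = a1*Sx + a2*Sxx
det = n * Sxx - Sx**2
a1 = (Sy * Sxx - Sx * Sxy) / det
a2 = (n * Sxy - Sx * Sy) / det
print(a1, a2)   # 1.0 2.0
```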
These should look familiar.
For a quadratic, $m=3$ and the equations are, for $j = 1, 2, 3$,
$\begin{array}\\ \sum_{i=1}^ny_ix_i^{j-1} &=\sum_{k=1}^3 a_k\sum_{i=1}^nx_i^{k+j-2}\\ &= a_1\sum_{i=1}^nx_i^{j-1}+a_2\sum_{i=1}^nx_i^{j}+a_3\sum_{i=1}^nx_i^{j+1}\\ \end{array} $
Example 2. To fit a line through the origin, $y = ax$, $m=1$ and $f_1(x) = x$. The equation is then $\sum_{i=1}^ny_ix_i =a_1\sum_{i=1}^nx_i^2 $ so the result is $a =\dfrac{\sum_{i=1}^nx_iy_i}{\sum_{i=1}^nx_i^2} $.
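The single-coefficient formula is a one-liner; a quick sketch with hypothetical data:

```python
import numpy as np

# Best-fit line through the origin: a = sum(x_i * y_i) / sum(x_i^2).
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.1, 3.9, 6.0])
a = np.sum(x * y) / np.sum(x * x)
print(a)
```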