Consider the following error function
$E(\boldsymbol w) = \frac{1}{2} \sum\limits_{n=1}^N \left( y(x_n,\boldsymbol w) - t_n \right)^2 $
where $\boldsymbol w$ is a vector of weights, $x_n$ and $t_n$ are the entries of two data vectors of length $N$, and $y$ is a polynomial of degree $M$:
$y(x,\boldsymbol w) = \sum\limits_{j=0}^M w_j x^j $
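To make this concrete, here is a minimal NumPy sketch of these definitions (the data arrays `x`, `t` and the degree `M` are placeholders, not part of the problem):

```python
import numpy as np

# Hypothetical data: N = 5 points (x_n, t_n) and polynomial degree M
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
t = np.array([0.1, 0.9, 0.3, -0.6, -0.2])
M = 3

def y(x, w):
    # y(x, w) = sum_{j=0}^{M} w_j * x**j
    return sum(w_j * x**j for j, w_j in enumerate(w))

def E(w, x, t):
    # E(w) = (1/2) * sum_{n=1}^{N} (y(x_n, w) - t_n)**2
    return 0.5 * np.sum((y(x, w) - t) ** 2)
```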
My task is to find the closed-form solution $ \boldsymbol w^* $ that minimizes $E(\boldsymbol w)$ and hence find $y(x, \boldsymbol w^*)$.
So I set $\frac{\partial E(\boldsymbol w)}{\partial w_i} = 0$ for each $i$ and obtained:
$ \sum\limits_{j=0}^M A_{ij} w_j = T_i$
where $A_{ij} = \sum\limits_{n=1}^N (x_n)^{i+j}$ and $T_i = \sum\limits_{n=1}^N (x_n)^i t_n $.
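Continuing the sketch above, $A$ and $T$ could be assembled like this (indices $i, j$ running over $0, \dots, M$):

```python
# A[i, j] = sum_n x_n**(i + j),  T[i] = sum_n x_n**i * t_n
powers = np.arange(M + 1)
A = np.array([[np.sum(x ** (i + j)) for j in powers] for i in powers])
T = np.array([np.sum(x ** i * t) for i in powers])
```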
But I'm not really sure how to solve for $\boldsymbol w$ and find $y(x, \boldsymbol w^*)$.
Any suggestions on how I can proceed?
Collect the powers of $x$ into an $N\times(M+1)$ Vandermonde matrix $V$:
$$\eqalign{ V_{ij} &= (x_i)^{\,j} \cr y(x_n,w) &= \sum_{j=0}^M w_jV_{nj} \cr }$$ The last equation has a very simple form in matrix notation: $$y = Vw$$ Writing the error with the trace/Frobenius product, $$A:B={\rm tr}(A^TB),$$ makes the differential and gradient easy to calculate.
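In NumPy terms (continuing the sketch from the question; `np.vander` with `increasing=True` builds exactly this matrix):

```python
# V[n, j] = x_n ** j, shape (N, M+1)
V = np.vander(x, M + 1, increasing=True)

# y(x_n, w) for all n at once is the matrix-vector product V @ w
w = np.ones(M + 1)                   # placeholder weights
assert np.allclose(V @ w, y(x, w))   # agrees with the componentwise definition
```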
$$\eqalign{ E &= \frac{1}{2}\,(y-t):(y-t) \cr dE &= (y-t):dy = (y-t):V\,dw = V^T(y-t):dw \cr \frac{\partial E}{\partial w} &= V^T(y-t) = V^T(Vw-t) \cr }$$ Now set the gradient to zero and solve $$\eqalign{ V^TVw &= V^Tt \cr w^* &= (V^TV)^{-1}V^Tt \cr }$$ Note that $(V^TV)_{ij} = \sum_n (x_n)^{i+j} = A_{ij}$ and $(V^Tt)_i = \sum_n (x_n)^i\,t_n = T_i$, so this is exactly the system you derived; the inverse exists whenever $V$ has full column rank, which holds when at least $M+1$ of the $x_n$ are distinct.
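A sketch of this final step (continuing the NumPy example above; in practice one solves the normal equations directly rather than forming the inverse):

```python
# Solve V^T V w = V^T t for w*
w_star = np.linalg.solve(V.T @ V, V.T @ t)

# Same system as the componentwise A w = T from the question
assert np.allclose(A @ w_star, T)

# A more numerically stable alternative: SVD-based least squares
w_star_ls, *_ = np.linalg.lstsq(V, t, rcond=None)
assert np.allclose(w_star, w_star_ls)

# Evaluate the fitted polynomial y(x, w*) at new inputs
x_new = np.linspace(0.0, 1.0, 11)
y_new = np.vander(x_new, M + 1, increasing=True) @ w_star
```

`np.linalg.lstsq` solves the least-squares problem via SVD, which behaves better than the normal equations when $V$ is ill-conditioned (e.g. for high polynomial degrees).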