Derivative of a vector function involving a summation


I'm reading a book about polynomial curve fitting and trying to implement some of its examples in Python, but I'm stuck on the calculus part.

We have a set of $N$ training points: $\mathbf x$ holds the observations on the $x$ axis, and $\mathbf t$ holds the corresponding target values.

$ \mathbf x = (x_1, x_2, \dots, x_N)^T$

$ \mathbf t = (t_1, t_2, \dots, t_N)^T$

I'm going to fit the data using a polynomial of the form

$y(x,\mathbf W) = w_0 + w_1x+w_2x^2+...+w_Mx^M = \sum_{j=0}^M w_jx^j $
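To make sure I understand the model, this is how I've coded the polynomial so far (the function name `y` and the plain-list representation of $\mathbf W$ are my own choices):

```python
def y(x, w):
    """Evaluate y(x, W) = sum_{j=0}^{M} w_j * x**j,
    where w = [w_0, w_1, ..., w_M]."""
    return sum(w_j * x**j for j, w_j in enumerate(w))
```

For example, with $\mathbf W = (1, 2, 3)^T$ and $x = 2$ this gives $1 + 2\cdot 2 + 3\cdot 2^2 = 17$.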

The coefficients $w_0, \dots, w_M$ are collectively denoted by the vector $\mathbf W$.

To determine the values of the coefficients I'm going to minimize the error function: $ E(\mathbf W) = \frac{1}{2}\sum_{n=1}^N (y(x_n,\mathbf W)-t_n)^2 $
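In code, I compute $E(\mathbf W)$ like this (a sketch of my own, using NumPy's `polyval` for the inner sum $\sum_j w_j x^j$):

```python
import numpy as np

def E(w, x, t):
    """Sum-of-squares error E(W) = 1/2 * sum_n (y(x_n, W) - t_n)**2."""
    y = np.polynomial.polynomial.polyval(x, w)  # y[n] = sum_j w[j] * x[n]**j
    return 0.5 * np.sum((y - t) ** 2)
```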

and to do that I have to calculate the gradient of $E(\mathbf W)$ and "take little steps in the opposite direction".

The problem is that I don't know how to calculate the partial derivatives of a function of a vector that also involves a summation, i.e. $\frac{\partial E(\mathbf W)}{\partial \mathbf W}$.
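For reference, this is the gradient-descent loop I'm trying to fill in. My guess (which I'd like to verify) is that differentiating each summand with the chain rule gives $\frac{\partial E}{\partial w_j} = \sum_{n=1}^N \bigl(y(x_n,\mathbf W) - t_n\bigr)\, x_n^j$, which in matrix form is $X^T(X\mathbf W - \mathbf t)$ with design matrix $X_{nj} = x_n^j$ (the function `fit` and its parameters are my own naming):

```python
import numpy as np

def fit(x, t, M, lr=0.01, steps=5000):
    """Fit an order-M polynomial to (x, t) by gradient descent on E(W)."""
    X = np.vander(x, M + 1, increasing=True)  # X[n, j] = x[n]**j
    w = np.zeros(M + 1)
    for _ in range(steps):
        grad = X.T @ (X @ w - t)  # my guess: dE/dw_j = sum_n (y_n - t_n) * x[n]**j
        w -= lr * grad            # little step in the opposite direction
    return w
```

On data generated from $t = 2x + 1$ with $M = 1$, this does recover coefficients close to $(1, 2)$, which makes me think the formula is right, but I'd like to see the derivation.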

Any help would be appreciated.

Thank you.