Is there a theory for piecewise differentiable regression polynomials?


I have an interesting question that I would like to have answered...

I have a very noisy signal $f$ that I want to smooth out. A single global regression will not work, as I don't have a model of the signal (it is a measurement of the solution of an ODE). I would like to use multiple regression polynomials $p_1$, $p_2$, etc., so that $$ \min_{p_1} \int_{x_0}^{x_1} (f(x)-p_1(x))^2 \, dx$$ and $$\min_{p_2} \int_{x_1}^{x_2} (f(x)-p_2(x))^2 \, dx. $$ This part is easy (use Matlab's polyfit on each interval) and not a problem. But I want to go further: I want to enforce $p_1(x_1)=p_2(x_1)$ and $p_1'(x_1) = p_2'(x_1)$ so that the result is continuous and smooth. This will of course increase the regression error.
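To make the question concrete, here is a minimal sketch (in Python/NumPy rather than Matlab) of the constrained version for two cubic pieces. The signal, the knot location $x_1$, and the degree are made-up illustrative choices; the constrained least-squares problem is solved directly via its KKT system with Lagrange multipliers for the two equality constraints $p_1(x_1)=p_2(x_1)$ and $p_1'(x_1)=p_2'(x_1)$.

```python
import numpy as np

# Hypothetical noisy signal on [0, 2] with a knot at x1 = 1 (illustrative values).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 201)
f = np.sin(3 * x) + 0.1 * rng.standard_normal(x.size)

x1 = 1.0
left = x <= x1
deg = 3  # cubic pieces

def vander(xs, d):
    # Columns 1, x, x^2, ..., x^d (coefficients in ascending order).
    return np.vander(xs, d + 1, increasing=True)

# Block design matrix [A1 0; 0 A2] acting on stacked coefficients c = [c1; c2].
A1, A2 = vander(x[left], deg), vander(x[~left], deg)
A = np.block([[A1, np.zeros_like(A1)],
              [np.zeros_like(A2), A2]])
b = np.concatenate([f[left], f[~left]])

# Equality constraints C c = 0: value and first derivative match at x1.
v = vander(np.array([x1]), deg)[0]                      # basis values at x1
dv = np.array([k * x1 ** (k - 1) if k > 0 else 0.0      # basis derivatives at x1
               for k in range(deg + 1)])
C = np.vstack([np.concatenate([v, -v]),
               np.concatenate([dv, -dv])])

# KKT system for: min ||A c - b||^2  subject to  C c = 0.
n = A.shape[1]
KKT = np.block([[2 * A.T @ A, C.T],
                [C, np.zeros((2, 2))]])
rhs = np.concatenate([2 * A.T @ b, np.zeros(2)])
sol = np.linalg.solve(KKT, rhs)
c1, c2 = sol[:deg + 1], sol[deg + 1:2 * (deg + 1)]

# The fitted pieces now agree in value and slope at the knot.
print(v @ c1, v @ c2)    # equal up to round-off
print(dv @ c1, dv @ c2)  # equal up to round-off
```

Note that this is exactly the least-squares spline problem, so in practice one could also try something like scipy's `LSQUnivariateSpline`, which fits a smoothing spline with prescribed interior knots instead of hand-assembling the constraints.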

Is there a theory for this, or do I need to build my own algorithm with an optimization toolbox?