Assume a function $f$ whose values $f(n)$ are given for $n\in\mathbb{Z}$.
On each interval $[n,n+1]$ the function can be interpolated by a polynomial of degree $m$. The polynomials should be built so that as many derivatives as possible of two neighboring segments agree at the shared node.
Let $f_m$ denote the $m$-th interpolant, i.e. the piecewise function whose polynomial pieces have degree $m$.
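For concreteness, the case $m=3$ of this construction (matching the value and first derivative of neighboring segments at each node) is cubic Hermite interpolation. A minimal pure-Python sketch, assuming the endpoint derivatives are supplied alongside the values (here hypothetically sampled from $f(x)=x^2$):

```python
def hermite_segment(p0, p1, d0, d1):
    """Cubic on [0,1] with values p0, p1 and derivatives d0, d1 at the endpoints.

    Returns coefficients (a0, a1, a2, a3) of a0 + a1*t + a2*t^2 + a3*t^3,
    obtained by solving h(0)=p0, h'(0)=d0, h(1)=p1, h'(1)=d1.
    """
    a0 = p0
    a1 = d0
    a2 = 3 * (p1 - p0) - 2 * d0 - d1
    a3 = 2 * (p0 - p1) + d0 + d1
    return (a0, a1, a2, a3)

def eval_piecewise(segments, x):
    """Evaluate the piecewise cubic; segments[n] covers [n, n+1]."""
    n = min(int(x), len(segments) - 1)  # clamp so the right endpoint is covered
    t = x - n
    a0, a1, a2, a3 = segments[n]
    return a0 + t * (a1 + t * (a2 + t * a3))

# Hypothetical data: values f(n) = n^2 with derivative samples f'(n) = 2n.
values = [n ** 2 for n in range(5)]
derivs = [2 * n for n in range(5)]
segments = [hermite_segment(values[n], values[n + 1], derivs[n], derivs[n + 1])
            for n in range(4)]
```

Since $x^2$ has degree at most 3, this particular interpolant reproduces it exactly on every segment, which makes the sketch easy to sanity-check.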
My question is now: does this sequence of interpolants converge, or does the polynomial interpolation at some point start oscillating wildly?
Converging in this sense: $$\forall \varepsilon>0:\exists n\in\mathbb{N}:\forall m>n:\int_{-\infty}^{\infty}|f_n(x)-f_m(x)|\,\mathrm{d}x<\varepsilon$$
Thank you very much
It would converge if the polynomial $f_{m+1}$ on the interval $[n+1,n+2]$ happened to be the Taylor polynomial $T_{m+1}(x)$ of degree $m+1$ of $f$, and the polynomial $f_m$ on the interval $[n,n+1]$ happened to be the Taylor polynomial $T_m(x)$ of degree $m$ of $f$. In that case the integrand would be bounded above by the Taylor remainder $R_{n+1}$ on each interval, so uniform convergence of the $f_n=T_n$ gives:
$$\int\limits_{-\infty}^\infty |f_n(x)-f_m(x)|\,\mathrm{d}x\le\int\limits_0^\infty |R_{n+1}(x)|\,\mathrm{d}x\xrightarrow{\;n\to\infty\;}0$$
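The Taylor case can be checked numerically: for $f=\exp$ about $0$, the gap between successive Taylor polynomials is exactly $x^{n+1}/(n+1)!$, so its $L^1$ norm on a fixed interval shrinks factorially. A small sketch (the midpoint Riemann sum and the interval $[0,1]$ are illustrative choices, not part of the argument above):

```python
import math

def taylor_exp(x, m):
    """Degree-m Taylor polynomial of exp about 0."""
    return sum(x ** k / math.factorial(k) for k in range(m + 1))

def l1_diff(n, m, a=0.0, b=1.0, steps=1000):
    """Midpoint Riemann-sum approximation of the L1 distance of T_n and T_m on [a, b]."""
    h = (b - a) / steps
    return sum(abs(taylor_exp(a + (i + 0.5) * h, n)
                   - taylor_exp(a + (i + 0.5) * h, m)) * h
               for i in range(steps))

# Gap between successive Taylor interpolants: 1/(n+2)! on [0,1], so it
# decreases monotonically toward 0.
gaps = [l1_diff(n, n + 1) for n in range(1, 8)]
```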
If the $f_m$ are not as above, there is no telling what can happen without further information on the behavior of $f$, because by construction:
$$\lim\limits_{x\to (n+1)^-}\frac{df_m}{dx}\neq \lim\limits_{x\to (n+1)^+}\frac{df_{m+1}}{dx}$$
which means there exist $x_n\in[n+1,n+2]$ and an $\varepsilon$ depending polynomially on $m$, such that:
$$|f_{m+1}(x_n)-f_m(x_n)|=\varepsilon(n^m)>0$$
There are at least $n$ such $x_n$ (one for each interval $[n,n+1]$), so the integral becomes:
$$\int_1^n |f_{m+1}(x)-f_m(x)|\,\mathrm{d}x\ge C\cdot n\cdot \varepsilon(n^m)>0$$
which vanishes only if $f$ is a constant. Otherwise, it may not converge.
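The derivative-mismatch case can also be made concrete: two pieces that agree at a knot but leave it with different slopes separate at least linearly in the distance from the knot, so the $L^1$ gap over the next interval stays bounded away from zero no matter how finely you look. A toy check with two hypothetical linear pieces (not the Hermite interpolants themselves, just an illustration of the mean-value bound):

```python
# Two pieces agreeing at the knot x = 1 but leaving it with slopes 1 and 3.
# Their difference is |2x - 2|, which grows linearly away from the knot, so
# the L1 gap over [1, 2] equals 1 regardless of how fine the grid is.
f_m  = lambda x: x          # slope 1 at the knot
f_m1 = lambda x: 3 * x - 2  # agrees at x = 1, slope 3 at the knot

steps = 1000
h = 1.0 / steps
l1_gap = sum(abs(f_m1(1 + (i + 0.5) * h) - f_m(1 + (i + 0.5) * h)) * h
             for i in range(steps))
```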