Linear interpolation optimization


I want to interpolate an arbitrary function $f(x)$ using only linear interpolation. So far I have found that the following equations do the trick pretty well: $$m(a,b,x)=\frac {f(b)-f(a)}{b-a}(x-a)+f(a)$$ $$L(a,b,x)=(a-x)(x-b)$$ $$T(x)=\frac 12(1+\mathrm {sign} (x))$$ $$F(x)=\sum_{n=1}^\infty m(v+(n-1)h,v+nh,x)\,T(L(v+(n-1)h,v+nh,x))$$ Here $v$ is the left endpoint of the first interval and $h$ is the width of each interval; since $L(a,b,x)>0$ exactly when $a<x<b$, the factor $T(L(a,b,x))$ acts as an indicator selecting the interval that contains $x$. It is straightforward to see that $F(x)\to f(x)$ as $h \to 0$. The interpolation error is $$R=f(x)-F(x)$$ Now I am wondering whether there is a function of $R$ that maximizes $h$ on an interval $[a,b]$, i.e. given an allowed error $R$, what is the largest $h$ that still achieves that error on $[a,b]$?
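For concreteness, here is a minimal numerical sketch of the construction above (the function names `m` and `F` mirror the formulas; `n_max`, which truncates the infinite sum, is my own addition):

```python
import numpy as np

def m(f, a, b, x):
    # Chord of f through (a, f(a)) and (b, f(b)).
    return (f(b) - f(a)) / (b - a) * (x - a) + f(a)

def F(f, v, h, x, n_max):
    """Piecewise-linear interpolant: sum of chords weighted by the
    indicator T(L(a, b, x)), following the formulas in the post.
    n_max truncates the infinite sum (an assumption for this sketch)."""
    total = 0.0
    for n in range(1, n_max + 1):
        a, b = v + (n - 1) * h, v + n * h
        L = (a - x) * (x - b)           # positive only for a < x < b
        T = 0.5 * (1 + np.sign(L))      # 1 inside the interval, 0 outside
        total += m(f, a, b, x) * T
    return total

# Error R = f(x) - F(x) for f = sin on [0, 1] with step h
h, x = 0.1, 0.05
R = np.sin(x) - F(np.sin, 0.0, h, x, n_max=10)
```

Note that at a shared knot $x = v + nh$ the two adjacent terms each contribute with weight $T(0)=\tfrac12$, and both chords pass through $f(x)$ there, so $F$ still reproduces $f$ exactly at the knots. Empirically, halving $h$ roughly quarters the maximum error, consistent with the well-known $O(h^2)$ behavior of linear interpolation.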