As the title says, I am looking for concrete examples of functions which are poorly approximated by polynomials (i.e., with a slow convergence rate) but well approximated by B-splines (ideally irrespective of the choice of knots, though I suspect that is not generally possible?). I am looking at the problem in a least-squares setting: I have a random collection of samples of the function, and I solve the resulting overdetermined problem using either a B-spline basis or a polynomial basis (e.g., Chebyshev).
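To make the setup concrete, here is a minimal sketch (all function and parameter choices are mine, purely for illustration) using NumPy's Chebyshev fitting and SciPy's `make_lsq_spline`. The classic example of a function that is hard for polynomials is $|x|$, which has a corner at $0$; a linear spline with a knot placed at the corner reproduces it exactly, while a degree-10 polynomial least-squares fit does not.

```python
# Sketch: least-squares fit of random samples of f(x) = |x| on [-1, 1],
# once in a Chebyshev polynomial basis and once in a B-spline basis.
# All parameters (degree, knots, sample count) are illustrative choices.
import numpy as np
from numpy.polynomial import chebyshev as C
from scipy.interpolate import make_lsq_spline

rng = np.random.default_rng(0)
f = lambda x: np.abs(x)                 # corner at 0: slow polynomial rate

x = np.sort(rng.uniform(-1, 1, 500))    # random samples, overdetermined fit
y = f(x)

# Degree-10 polynomial least squares (Chebyshev basis for conditioning).
cheb = C.Chebyshev.fit(x, y, deg=10)

# Linear (k=1) B-spline least squares with an interior knot at the corner.
# |x| is piecewise linear with its only break at 0, so it lies in this space.
k = 1
t = np.r_[[x[0]] * (k + 1), [-0.5, 0.0, 0.5], [x[-1]] * (k + 1)]
spl = make_lsq_spline(x, y, t, k=k)

xx = np.linspace(x[0], x[-1], 2001)
err_poly = np.max(np.abs(cheb(xx) - f(xx)))
err_spl = np.max(np.abs(spl(xx) - f(xx)))
print(err_poly)   # O(1/n) behaviour: roughly a few percent
print(err_spl)    # ~ machine precision, since |x| is in the spline space
```

Note that this relies on placing a knot at the corner; with knots bounded away from $0$, the spline fit degrades there too, which is why "irrespective of the choice of knots" is too much to hope for in general.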
Thanks!
For smooth functions, e.g., differentiable functions, spline and polynomial approximations give the same degree of approximation. For very smooth functions, e.g., analytic functions, polynomials are better: polynomial approximation does not exhibit saturation, whereas spline approximation of a fixed order does.

As for computing these approximations from a random collection of samples, the question needs to be made more precise. Assuming that the approximation takes place on $[-1,1]$ and the samples are distributed properly, there are ways to construct both polynomial and spline approximations that yield asymptotically the same degree of approximation. For polynomial approximation, see my paper in Jaén J. of Approx., {\bf 1} (1) (2009), 1--25; for spline approximation, see my paper in J. Math. Anal. Appl., {\bf 251} (2000), 356--363. In both cases, it is not necessary to solve a system of equations.

Spline approximation will usually work better if the function is equal to $0$ on parts of the interval, because we can arrange for the approximation to be $0$ there as well, for the most part. The same can be achieved with polynomials using my localized kernels, but of course the approximation will not be identically zero on any interval.

When the samples are not dense on the interval, we don't expect a good approximation on the whole interval, but see my paper in Journal of Computational Physics, {\bf 249} (2013), 96--112, for a procedure we have called Minimum Sobolev Norm interpolation.
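The point about functions vanishing on part of the interval can be illustrated with a small sketch (my own, not taken from the papers above; parameters are illustrative). The truncated power $f(x) = \max(x,0)^3$ is identically $0$ on $[-1,0]$ and is itself a cubic spline with a single knot at $0$, so a cubic B-spline least-squares fit with a knot there is zero on the left half up to rounding, while a polynomial fit is nonzero throughout that subinterval.

```python
# Sketch: f(x) = max(x, 0)^3 vanishes on [-1, 0]. A cubic spline with a
# knot at 0 reproduces it exactly (hence is ~0 on the left half); a
# polynomial least-squares fit cannot be identically zero on any interval.
import numpy as np
from numpy.polynomial import chebyshev as C
from scipy.interpolate import make_lsq_spline

rng = np.random.default_rng(1)
f = lambda x: np.maximum(x, 0.0) ** 3

x = np.sort(rng.uniform(-1, 1, 400))    # random samples
y = f(x)

# Cubic B-spline least squares; f lies in this space (knot at 0).
k = 3
t = np.r_[[x[0]] * (k + 1), [-0.5, 0.0, 0.5], [x[-1]] * (k + 1)]
spl = make_lsq_spline(x, y, t, k=k)

# Degree-6 polynomial least squares in a Chebyshev basis.
cheb = C.Chebyshev.fit(x, y, deg=6)

xl = np.linspace(x[0], 0.0, 500)        # the region where f is identically 0
print(np.max(np.abs(spl(xl))))          # ~ machine precision
print(np.max(np.abs(cheb(xl))))         # visibly nonzero
```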