I have a function
$$f(x) = \left\{ \begin{array}{lr} 0 & 0 \leq x < 1/3\\ q(x) & 1/3 \leq x < 2/3 \\ 1 & 2/3\leq x \leq 1 \end{array} \right. $$ where $q(x)$ is some analytic function that interpolates between the two constant pieces and matches the values and first derivatives at $x=1/3$ and $x=2/3$ (say a spline fit). I want to approximate this with a polynomial (as the Weierstrass approximation theorem says I can) and want to understand how the error falls off as I increase the degree $n$ of the approximating polynomial $p_n(x)$, where the error is defined as $\epsilon_n = \sup_{x\in[0,1]}|f(x)-p_n(x)|$.
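For concreteness, here is one possible choice of $q$ (my assumption, not fixed by the problem): the cubic Hermite "smoothstep", which matches the values and first derivatives at $x=1/3$ and $x=2/3$, so $f$ is globally $C^1$:

```python
import numpy as np

def f(x):
    """Piecewise function: 0 on [0,1/3), q(x) on [1/3,2/3), 1 on [2/3,1].

    Assumption: q is the cubic Hermite "smoothstep" satisfying
    q(1/3)=0, q(2/3)=1, q'(1/3)=q'(2/3)=0, which makes f C^1 on [0,1].
    """
    x = np.asarray(x, dtype=float)
    t = np.clip(3.0 * x - 1.0, 0.0, 1.0)  # map [1/3,2/3] -> [0,1]; clamp outside
    return 3.0 * t**2 - 2.0 * t**3        # 0 below 1/3, 1 above 2/3

print(f([0.0, 1/3, 0.5, 2/3, 1.0]))  # values at 0, 1/3, 1/2, 2/3, 1
```

Any other $C^1$ interpolant would do; the error scaling below depends only on this degree of smoothness, not on the particular $q$.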
My initial idea was to choose a set of orthogonal polynomials (say the shifted Legendre polynomials $l_m(x)$, which are orthogonal on $[0,1]$) and then try to express $f(x) \approx \sum_{m=0}^n a_m l_m(x)$. The error can then be bounded by the tail of the series that has been cut off, and we should be able to estimate this. However, $f(x)$ is not analytic, so I do not know whether extracting the $a_m$ is possible in the conventional way of
$$a_m = (2m+1)\int_0^1 f(x)\, l_m(x)\, dx,$$ where the factor $2m+1$ comes from the normalization $\int_0^1 l_m(x)^2\, dx = 1/(2m+1)$ of the shifted Legendre polynomials.
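This coefficient extraction can be carried out numerically. The sketch below (assuming, for illustration only, that $q$ is the cubic smoothstep, so $f$ is $C^1$) computes the shifted Legendre coefficients by Gauss–Legendre quadrature and measures the sup-norm error of the truncated series:

```python
import numpy as np
from numpy.polynomial.legendre import legval, leggauss

def f(x):
    # C^1 test function; assumption: q is the cubic smoothstep
    t = np.clip(3.0 * np.asarray(x, float) - 1.0, 0.0, 1.0)
    return 3.0 * t**2 - 2.0 * t**3

# Gauss-Legendre nodes/weights on [-1,1]; map to [0,1] via x = (u+1)/2
u, w = leggauss(200)
x = 0.5 * (u + 1.0)

def legendre_coeff(m):
    """a_m = (2m+1) * integral_0^1 f(x) P_m(2x-1) dx  (shifted Legendre)."""
    Pm = legval(u, [0.0] * m + [1.0])                 # P_m at u = 2x-1
    return (2 * m + 1) * 0.5 * np.sum(w * f(x) * Pm)  # 0.5 = Jacobian dx/du

n = 20
a = np.array([legendre_coeff(m) for m in range(n + 1)])

# Reconstruct the truncated series and estimate the sup-norm error on a grid
xs = np.linspace(0, 1, 2001)
pn = sum(a[m] * legval(2 * xs - 1, [0.0] * m + [1.0]) for m in range(n + 1))
print("sup error at degree", n, ":", np.max(np.abs(f(xs) - pn)))
```

Note that $a_0 = \int_0^1 f\,dx = 1/2$ here, which is a quick sanity check on the quadrature.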
Is this a valid method of extracting the coefficients $a_m$, given that $f(x)$ is not analytic? If not, is there another way to go about it?
More generally, is there a better way of finding out how the error scales with the degree of the approximating polynomial?
You can use a Chebyshev series. Analyticity is not needed to define the coefficients (integrability suffices); what smoothness buys you is the decay rate. Since your $f$ is $C^1$ and (for any reasonably smooth $q$) its second derivative has bounded variation, the Chebyshev coefficients decay like $O(m^{-3})$ and the truncated series has uniform error $O(n^{-2})$ (see Trefethen, *Approximation Theory and Approximation Practice*, Ch. 7). If you want the best uniform approximation of a given degree, try the Remez algorithm, though the Chebyshev truncation is already within a factor $O(\log n)$ of the best polynomial.
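A quick numerical check of this rate (a sketch assuming, as an illustration, that $q$ is the cubic smoothstep; this uses Chebyshev interpolation rather than the exact truncated series, which behaves the same up to a small factor):

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

def f(x):
    # C^1 test function; assumption: q is the cubic smoothstep
    t = np.clip(3.0 * np.asarray(x, float) - 1.0, 0.0, 1.0)
    return 3.0 * t**2 - 2.0 * t**3

xs = np.linspace(0, 1, 5001)
errs = []
for n in [8, 16, 32, 64, 128]:
    # Degree-n interpolant of f at Chebyshev points, mapped to [0,1]
    p = Chebyshev.interpolate(f, n, domain=[0, 1])
    err = float(np.max(np.abs(f(xs) - p(xs))))
    errs.append(err)
    print(f"n = {n:4d}   sup error = {err:.2e}")
# For a C^1 function with f'' of bounded variation, the error should
# shrink roughly like n^{-2}, i.e. drop by about 4x per doubling of n.
```

If you replace $q$ with a smoother interpolant (matching more derivatives at $1/3$ and $2/3$), the decay rate improves correspondingly.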