I have a simple function $x^n$ that I want to approximate on the interval $[-1, 1]$ by polynomials of lower degree. I understand there are 3 possible kinds of best approximation:
- Least first power or L1 approximation.
- Least squares or L2 approximation.
- Chebyshev, minimax, or $L_{\infty}$ approximation.
The goal is to use the third kind of approximation to minimize the maximum error $$\max_{x \in [-1, 1]} |f(x) - g(x)|.$$
What I've done is:
Normalise the Chebyshev polynomials so that the leading coefficient is 1, i.e. define the monic polynomials $$\bar{T}_k(x) = T_k(x) / 2^{k-1}, \qquad k \ge 1.$$
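For concreteness, the first few monic Chebyshev polynomials are
$$\bar{T}_1(x) = x, \quad \bar{T}_2(x) = x^2 - \tfrac{1}{2}, \quad \bar{T}_3(x) = x^3 - \tfrac{3}{4}x, \quad \bar{T}_4(x) = x^4 - x^2 + \tfrac{1}{8},$$
and each satisfies $\max_{x \in [-1,1]} |\bar{T}_k(x)| = 2^{1-k}$, which (if I understand correctly) is the smallest possible maximum on $[-1, 1]$ of any monic polynomial of degree $k$.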
Then $x^n - \bar{T}_n(x)$ has degree at most $n-2$, since the $x^n$ terms cancel and $T_n$ contains no $x^{n-1}$ term.
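As a sanity check, here is a small NumPy sketch of that cancellation (the degree $n = 5$ is just an example I picked):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

n = 5  # example degree; any n >= 2 works

# Power-basis coefficients (ascending) of T_n, rescaled to the monic
# polynomial bar{T}_n = T_n / 2^{n-1}.
Tn_monic = C.cheb2poly([0] * n + [1]) / 2 ** (n - 1)

xn = np.zeros(n + 1)
xn[n] = 1.0                      # power-basis coefficients of x^n

diff = np.polynomial.Polynomial(xn - Tn_monic).trim()
print(diff)                      # x^5 - bar{T}_5(x) = (5/4)x^3 - (5/16)x
print(diff.degree())             # n - 2 = 3: the x^n terms cancel and
                                 # T_n has no x^{n-1} term
```

The trailing zero coefficients are removed by `trim()`, so `diff.degree()` reports the true degree of the difference.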
But how do we use this property to economize a Taylor series, and how do we estimate the error incurred by one economization step?