I am currently studying Taylor polynomials. I was wondering: can the degree to which a (continuous) function approximates another over an interval be quantified?
At a single point it is easy: you just compute $|f(a)-g(a)|$, and this tells you how close the functions are at that point.
However, what if I want to determine how well one function approximates another over an entire interval?
One possible approach would be to take the difference $f(x)-g(x)$, compute its average value over the interval, and take the absolute value of the result. The lower the number, the better the approximation.
Is this approach viable? If not, what is the standard approach?
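The proposed measure is straightforward to compute numerically. Here is a minimal sketch (the function names and the midpoint-rule discretization are my own choices, used only for illustration), comparing $e^x$ with its degree-2 Taylor polynomial on $[0,1]$:

```python
import math

def avg_abs_error(f, g, a, b, n=10_000):
    """|average of f - g| over [a, b], estimated with the midpoint rule."""
    h = (b - a) / n
    total = sum(f(a + (i + 0.5) * h) - g(a + (i + 0.5) * h) for i in range(n))
    return abs(total * h / (b - a))

# Example: exp(x) vs. its degree-2 Taylor polynomial about 0, on [0, 1]
err = avg_abs_error(math.exp, lambda x: 1 + x + x**2 / 2, 0.0, 1.0)
print(err)
```

Note that the average is taken of the *signed* differences, so positive and negative deviations can cancel before the absolute value is applied.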
Perhaps the most popular option on an interval $[a,b]$ with $a<b$ is the squared $L^2$ distance $\int_a^b \bigl(f(x)-g(x)\bigr)^2\,dx$. Squaring the difference prevents positive and negative deviations from cancelling, which is the main weakness of averaging the signed difference $f-g$. (Other common choices are $\int_a^b |f(x)-g(x)|\,dx$ and the sup norm $\max_{a\le x\le b}|f(x)-g(x)|$.)
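This distance can also be estimated numerically; here is a small sketch (the function name and midpoint-rule discretization are my own, for illustration), again comparing $e^x$ with its degree-2 Taylor polynomial on $[0,1]$:

```python
import math

def l2_error_sq(f, g, a, b, n=10_000):
    """Estimate the squared L2 distance, integral of (f - g)^2 over [a, b],
    with the midpoint rule."""
    h = (b - a) / n
    return sum((f(a + (i + 0.5) * h) - g(a + (i + 0.5) * h)) ** 2
               for i in range(n)) * h

# Example: exp(x) vs. its degree-2 Taylor polynomial about 0, on [0, 1]
err = l2_error_sq(math.exp, lambda x: 1 + x + x**2 / 2, 0.0, 1.0)
print(err)
```

Because the integrand is a square, the result is zero exactly when $f$ and $g$ agree (almost) everywhere on the interval, so it cannot be fooled by cancellation.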