How can one measure how well a function is approximated over an interval?


I am currently studying Taylor polynomials, and I was wondering: can the degree to which a (continuous) function approximates another over an interval be quantified?

At a point it is easy: you just compute $|f(a)-g(a)|$, and this tells you how close the functions are at that point.

However, what if I want to determine how well a function approximates another over an entire interval?

One possible approach would be to take the difference $f(x)-g(x)$, compute its average value over the interval, and take the absolute value of the result. The lower the number, the better the approximation.

Is this approach viable? If not what is the standard approach to this?
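As an illustration of the proposed measure, here is a small numerical sketch (the example functions are my own choice, not from the question): it compares $f(x)=\sin x$ with its cubic Taylor polynomial $g(x)=x-x^3/6$ on $[0,\pi]$, approximating the average of $f-g$ with a midpoint Riemann sum.

```python
import math

def f(x):
    return math.sin(x)

def g(x):
    # Cubic Taylor polynomial of sin at 0 (illustrative choice)
    return x - x**3 / 6

def average_difference(f, g, a, b, n=10_000):
    """Approximate (1/(b-a)) * integral of (f - g) via a midpoint Riemann sum."""
    h = (b - a) / n
    total = sum(f(a + (i + 0.5) * h) - g(a + (i + 0.5) * h) for i in range(n))
    return total * h / (b - a)

err = abs(average_difference(f, g, 0.0, math.pi))
print(err)  # roughly 0.36 for this pair
```

One caveat with averaging the *signed* difference: positive and negative errors can cancel, so a poor approximation that oscillates around $f$ can score deceptively well. Averaging $|f(x)-g(x)|$ instead avoids this.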


Best answer:

Perhaps the most popular option on the interval $[a,\,b]$ with $a<b$ is the squared $L^2$ distance $\int_a^b (f-g)^2 \, dx$.
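A minimal numerical sketch of this measure, again using $\sin x$ and its cubic Taylor polynomial on $[0,\pi]$ as an illustrative pair (the integral is approximated with a midpoint Riemann sum, and the square root is taken to get the $L^2$ distance itself):

```python
import math

def l2_distance(f, g, a, b, n=10_000):
    """Approximate the L^2 distance sqrt(integral of (f - g)^2) on [a, b]."""
    h = (b - a) / n
    total = sum((f(a + (i + 0.5) * h) - g(a + (i + 0.5) * h)) ** 2
                for i in range(n))
    return math.sqrt(total * h)

d = l2_distance(math.sin, lambda x: x - x**3 / 6, 0.0, math.pi)
print(d)
```

Squaring the difference penalizes large deviations more heavily than small ones, which is one reason this distance is so common in approximation theory and statistics.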

Another answer:

There are a number of different measures meant for different purposes.

One particularly useful measure is the sup norm: given a function $h$, we set

$$ \| h \|_\infty = \sup_x |h(x)| $$

(if $|h|$ has a maximum value, then $\| h \|_\infty$ is simply that maximum value)

So, you can measure the difference between the functions to be $\| f-g \|_\infty$.
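A rough numerical sketch of $\| f-g \|_\infty$, using the same illustrative pair as above (my own example, not from the answer). A true supremum would need calculus or interval arithmetic; sampling on a fine grid merely approximates it from below:

```python
import math

def sup_distance(f, g, a, b, n=100_000):
    """Estimate ||f - g||_inf on [a, b] by sampling on a uniform grid."""
    return max(abs(f(a + i * (b - a) / n) - g(a + i * (b - a) / n))
               for i in range(n + 1))

d = sup_distance(math.sin, lambda x: x - x**3 / 6, 0.0, math.pi)
print(d)  # close to pi**3/6 - pi, the error at the right endpoint
```

For this pair the worst error occurs at the endpoint $x=\pi$, where it equals $\pi^3/6 - \pi \approx 2.026$, so the grid estimate lands very close to that value.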

One reason this measure is useful is that uniform convergence is a particularly important notion for doing calculus with functions. A sequence of approximations $f_n$ converges to $f$ uniformly if and only if $\| f_n - f \|_\infty \to 0$ as $n \to \infty$.

Another reason it's useful is because many applications boil down to "I want to be sure the error of this larger calculation is less than some bound $\epsilon$".

Under this goal, when you use approximations in the course of the calculation, it doesn't matter if your approximation is nearly perfect on most inputs if it's really bad on the others (unless you can afford to spend the effort to figure out which case you're in). What matters is the largest amount of error that your approximation can introduce, which is precisely the distance given by the sup norm.


Incidentally, the reason for the $\infty$ subscript is by analogy with a family of other norms. E.g. on $\mathbb{R}^2$, we can define

$$ \| (x,y) \|_p = \sqrt[p]{|x|^p + |y|^p} $$

The sup norm is the limit of this as $p \to \infty$.
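A quick sketch of that limit on a fixed vector (the vector $(3, 4)$ is an arbitrary illustrative choice): as $p$ grows, the $p$-norm decreases toward the largest coordinate magnitude.

```python
def p_norm(v, p):
    """The p-norm of a vector: (sum of |x|^p)^(1/p)."""
    return sum(abs(x) ** p for x in v) ** (1.0 / p)

v = (3.0, 4.0)
for p in (1, 2, 8, 64):
    print(p, p_norm(v, p))
# The values shrink toward max(|3|, |4|) = 4 as p grows.
```

Intuitively, for large $p$ the largest coordinate dominates the sum, so the $p$-th root recovers essentially that coordinate alone.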