Numerical Approximation of the Derivative


In high school, students are typically presented with the one-sided definition of the derivative. For a suitable map $f$ (say $f:\mathbb{R}\to\mathbb{R}$), by definition $$f'(x) = \lim_{\epsilon \to 0} \frac{f(x + \epsilon) - f(x)}{\epsilon}$$

Another definition is the symmetric one: $$f'(x) = \lim_{\epsilon \to 0} \frac{f(x + \epsilon) - f(x - \epsilon)}{2\epsilon}$$ From a theoretical point of view these two are the same, since $\epsilon$ truly approaches zero. From a practical point of view, however, numerical computation can only approximate the limit with a small but finite $\epsilon$. It turns out that the second definition then yields an error of $O(\epsilon^2)$, whereas the first yields an error of $O(\epsilon)$. Why is this the case? Is there a known proof of this?
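The gap between the two error orders is easy to see experimentally. A minimal sketch (my own illustration, not from the question), using $f = \sin$ at $x = 1$ where the exact derivative is $\cos(1)$:

```python
import math

def forward_diff(f, x, h):
    # One-sided difference: error behaves like O(h)
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    # Symmetric difference: error behaves like O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.0
exact = math.cos(x)  # derivative of sin at x = 1

for h in (1e-2, 1e-3, 1e-4):
    err_fwd = abs(forward_diff(math.sin, x, h) - exact)
    err_cen = abs(central_diff(math.sin, x, h) - exact)
    print(f"h={h:.0e}  forward error={err_fwd:.2e}  central error={err_cen:.2e}")
```

Each tenfold decrease in $h$ shrinks the forward-difference error by roughly a factor of 10, but the central-difference error by roughly a factor of 100.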

Best answer:

Recall the Taylor expansion: for $f \in C^3$, you have:

  • $f(x+h) = f(x) + f'(x)h + \dfrac{1}{2} f''(x) h^2 + O(h^3)$
  • $f(x-h) = f(x) - f'(x)h + \dfrac{1}{2} f''(x) h^2 + O(h^3)$

Then, forming the corresponding difference quotients and subtracting $f'(x)$, you have:

  • $\dfrac{f(x+h) - f(x)}{h} - f'(x) = \dfrac{1}{2} f''(x) h + O(h^2) = O(h)$
  • $\dfrac{f(x+h) - f(x-h)}{2h} - f'(x) = O(h^2)$, since the $\dfrac{1}{2} f''(x) h^2$ terms cancel when the two expansions are subtracted.
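The predicted orders can be measured directly: if the error scales like $h^p$, then halving $h$ divides the error by $2^p$, so $p \approx \log_2(\text{err}(h)/\text{err}(h/2))$. A small sketch under my own choice of test function ($f = \sin$, $x = 1$, hypothetical helper `order`):

```python
import math

def order(err_h, err_h2):
    # Empirical convergence order from errors at step h and h/2:
    # err(h) ~ C h^p  =>  err(h)/err(h/2) ~ 2^p
    return math.log2(err_h / err_h2)

x, exact = 1.0, math.cos(1.0)

def fwd_err(h):
    # |forward-difference error| at step h
    return abs((math.sin(x + h) - math.sin(x)) / h - exact)

def cen_err(h):
    # |central-difference error| at step h
    return abs((math.sin(x + h) - math.sin(x - h)) / (2 * h) - exact)

h = 1e-3
print("forward order ≈", order(fwd_err(h), fwd_err(h / 2)))  # close to 1
print("central order ≈", order(cen_err(h), cen_err(h / 2)))  # close to 2
```

This matches the Taylor argument above: the surviving leading term is $\frac{1}{2}f''(x)h$ for the one-sided quotient and an $O(h^2)$ term for the symmetric one. (For very small $h$, floating-point cancellation eventually pollutes both estimates, so the measurement should use moderate step sizes.)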