In high school, students are typically presented with the one-sided definition of the derivative: for a real-valued function $f$, by definition $$f'(x) = \lim_{\epsilon \to 0} \frac{f(x + \epsilon) - f(x)}{\epsilon}$$
Another definition is given as $$f'(x) = \lim_{\epsilon \to 0} \frac{f(x + \epsilon) - f(x - \epsilon)}{2\epsilon}$$ From a theoretical point of view these two definitions are equivalent, since $\epsilon$ truly approaches zero. However, from a practical point of view in the world of numerical computation we can only approximate the limit with a small but finite $\epsilon$. It turns out that the second definition yields an error of $O(\epsilon^2)$, whereas the first one yields an error of $O(\epsilon)$. Why is this the case? Is there a known proof of this?
Recall the Taylor expansion: for $f \in C^3$, you have
$$f(x + \epsilon) = f(x) + \epsilon f'(x) + \frac{\epsilon^2}{2} f''(x) + O(\epsilon^3)$$
$$f(x - \epsilon) = f(x) - \epsilon f'(x) + \frac{\epsilon^2}{2} f''(x) + O(\epsilon^3)$$
Then with the appropriate differences, you have
$$\frac{f(x + \epsilon) - f(x)}{\epsilon} = f'(x) + \frac{\epsilon}{2} f''(x) + O(\epsilon^2) = f'(x) + O(\epsilon)$$
$$\frac{f(x + \epsilon) - f(x - \epsilon)}{2\epsilon} = f'(x) + O(\epsilon^2)$$
In the symmetric (central) difference, the $\epsilon^2 f''(x)$ terms cancel when you subtract the two expansions, so the leading error term is of order $\epsilon^2$; in the one-sided difference, the $f''(x)$ term survives and the error is of order $\epsilon$.
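You can also observe the two convergence rates numerically. Here is a short Python sketch (the choice of $f = \sin$, the evaluation point, and the step sizes are just for illustration): as $\epsilon$ shrinks by a factor of 10, the forward-difference error shrinks by roughly 10, while the central-difference error shrinks by roughly 100.

```python
import math

def forward_diff(f, x, eps):
    # One-sided (forward) difference: error O(eps)
    return (f(x + eps) - f(x)) / eps

def central_diff(f, x, eps):
    # Symmetric (central) difference: error O(eps^2)
    return (f(x + eps) - f(x - eps)) / (2 * eps)

f, x = math.sin, 1.0
exact = math.cos(x)  # exact derivative of sin at x

for eps in (1e-1, 1e-2, 1e-3):
    err_fwd = abs(forward_diff(f, x, eps) - exact)
    err_cen = abs(central_diff(f, x, eps) - exact)
    print(f"eps={eps:.0e}  forward error={err_fwd:.2e}  central error={err_cen:.2e}")
```

Note that $\epsilon$ cannot be made arbitrarily small in floating point: below roughly $\sqrt{\text{machine epsilon}}$, cancellation in $f(x+\epsilon) - f(x)$ starts to dominate the truncation error.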