Background:
This is from Mark Joshi's book on mathematical finance, Chapter 7, Exercise 9.
Question:
Suppose $f$ is twice-differentiable and $f^{\prime\prime}(x)$ is non-zero. Show that $$ \frac{f(x+h) - f(x-h)}{2h}$$ converges to $f'(x)$ as $h\to 0^{+}$ faster than $$ \frac{f(x+h) - f(x)}{h}$$ does.
The book suggests taking a Taylor expansion and noting that in the symmetric case the first-order term cancels, but it is not clear to me how this shows a faster rate of convergence for one quotient than for the other. Any suggestions are greatly appreciated.
Since $f$ is only assumed twice-differentiable, use Taylor's theorem with the Peano form of the remainder: $$ f(x \pm h) = f(x) \pm hf'(x) + \frac{h^2}{2}f''(x) + o(h^2) $$ hence $$ \frac{f(x+h)-f(x)}{h} = f'(x) + \frac{h}{2}f''(x) + o(h) $$ and, because the $\frac{h^2}{2}f''(x)$ terms cancel in the difference, $$ \frac{f(x+h)-f(x-h)}{2h} = f'(x) + o(h) $$ Can you finish the argument?
UPDATE
You can see that the error in the one-sided quotient, $\frac{h}{2}f''(x) + o(h)$, is of exactly linear order in $h$ precisely because $f''(x) \neq 0$, whereas the error in the symmetric quotient is $o(h)$. To make "faster" precise, divide each error by $h$ and let $h \to 0^{+}$: the first ratio tends to $\frac{1}{2}f''(x) \neq 0$, while the second tends to $0$, so the symmetric quotient converges strictly faster. (If $f$ happens to be three times differentiable, the symmetric error is in fact $O(h^2)$, a full order better.)
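To see the two rates concretely, here is a small numerical check (my own illustration, not part of Joshi's exercise) using $f(x) = e^x$ at $x = 0$, where $f'(0) = 1$ and $f''(0) = 1 \neq 0$. Shrinking $h$ by a factor of $10$ shrinks the one-sided error by roughly $10$ but the symmetric error by roughly $100$:

```python
import math

# Compare the one-sided and symmetric difference quotients for
# f(x) = e^x at x = 0, where f'(0) = 1 and f''(0) = 1 != 0.

def one_sided(f, x, h):
    return (f(x + h) - f(x)) / h

def symmetric(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)

f, exact = math.exp, 1.0  # f'(0) = e^0 = 1
for h in (1e-1, 1e-2, 1e-3):
    e1 = abs(one_sided(f, 0.0, h) - exact)   # shrinks like (h/2) f''(0): linear
    e2 = abs(symmetric(f, 0.0, h) - exact)   # shrinks like (h^2/6) f'''(0): quadratic
    print(f"h={h:.0e}   one-sided error={e1:.2e}   symmetric error={e2:.2e}")
```

The printed table shows the one-sided error tracking $h/2$ while the symmetric error tracks $h^2/6$, matching the orders in the Taylor argument above.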