Theoretically, how should I compare the accuracy of the following two approximations?
$f(x_2) \approx f(x_1) + (x_2-x_1)\,df(x_1)$
$f(x_2) \approx f(x_1) + (x_2-x_1)\,\frac{df(x_1)+df(x_2)}{2}$
where $df$ is the first-order derivative of $f$.
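In case it helps frame the question: a quick empirical check (not a proof) is to shrink the step $h = x_2 - x_1$ and watch how fast each error decays. Below is a minimal sketch; the test function $f(x)=e^x$, the helper name `compare`, and the specific step sizes are my own choices, not from the question.

```python
import math

def compare(f, df, x1, h):
    """Return the absolute errors of the two approximations of f(x1 + h)."""
    x2 = x1 + h
    exact = f(x2)
    approx1 = f(x1) + h * df(x1)                 # slope at x1 only
    approx2 = f(x1) + h * (df(x1) + df(x2)) / 2  # averaged slopes at x1 and x2
    return abs(exact - approx1), abs(exact - approx2)

# Halving h roughly quarters the first error (consistent with O(h^2))
# and roughly eighths the second (consistent with O(h^3)).
for h in (0.1, 0.05, 0.025):
    e1, e2 = compare(math.exp, math.exp, 0.0, h)
    print(f"h={h:.3f}  err1={e1:.2e}  err2={e2:.2e}")
```

For $f=e^x$ both the function and its derivative are `math.exp`, which keeps the example short; any smooth $f$ with a known derivative would do.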
Any help is appreciated.
Thanks!