Bound remainder of Taylor series with Lipschitz property of derivative

I feel like this should be fairly simple: I would like to use the fact that

$$|g'(x) - g'(y)|\leq C|x-y|^\delta$$

for all $x,y\in \mathbb{R}$ and some constants $C>0$, $\delta\in(0,1]$ (a Hölder condition on $g'$; Lipschitz when $\delta=1$) to put a bound on the remainder $R$ of

$$ g(y) = g(x) + g'(x)(y-x) + R$$

So: does a Lipschitz (or, more generally, Hölder) condition on the derivative of a function tell us how good the first-order Taylor approximation is?


Accepted answer

Since $g'$ is continuous, the fundamental theorem of calculus gives

$$g(y) = g(x) + g'(x)(y-x) + \int_0^1 \bigl(g'(x+t(y-x)) - g'(x)\bigr)(y-x)\,dt,$$

so the remainder is $R = \int_0^1 \bigl(g'(x+t(y-x)) - g'(x)\bigr)(y-x)\,dt$. The Hölder condition bounds the integrand: $|g'(x+t(y-x)) - g'(x)| \le C\,|t(y-x)|^\delta = C\,t^\delta |y-x|^\delta$ for $t\in[0,1]$, hence

$$|R| \le C\,|y-x|^{\delta+1}\int_0^1 t^\delta\,dt = \frac{C}{\delta+1}\,|y-x|^{\delta+1}.$$
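As a quick numerical sanity check (not part of the original answer), here is a minimal sketch with the illustrative choice $g(x)=x^2$, whose derivative $g'(x)=2x$ is Lipschitz with $C=2$, $\delta=1$. For this $g$ the remainder is exactly $(y-x)^2$, so the bound $\frac{C}{\delta+1}|y-x|^{\delta+1} = |y-x|^2$ is attained with equality:

```python
# Check |R| <= C/(delta+1) * |y-x|^(delta+1) for g(x) = x**2,
# whose derivative g'(x) = 2x is Lipschitz with C = 2, delta = 1.
# (Illustrative example; any g with a Hölder-continuous derivative works.)

def g(x):
    return x * x

def g_prime(x):
    return 2.0 * x

C, delta = 2.0, 1.0

def remainder(x, y):
    """Taylor remainder R = g(y) - g(x) - g'(x)(y - x)."""
    return g(y) - g(x) - g_prime(x) * (y - x)

for x, y in [(0.0, 1.0), (1.0, 3.0), (-2.0, 0.5)]:
    R = abs(remainder(x, y))
    bound = C / (delta + 1) * abs(y - x) ** (delta + 1)
    assert R <= bound + 1e-12  # the bound from the answer holds
    print(f"x={x}, y={y}: |R| = {R:.4f}, bound = {bound:.4f}")
```

For a quadratic the two sides agree exactly, which shows the constant $\frac{C}{\delta+1}$ cannot be improved in general.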