Using the mean value theorem to approximate a curve?


If I know $f(a)$, $f'(a)$, $f''(a)$, and so on, I can easily conclude that $f(x)$ is approximately $f(a) + (x - a) f'(a)$. But how do I use $f''(a)$ to compute a better approximation of the curve? I can't seem to find a way to bring the second derivative into it.
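For concreteness, the "better approximation" being asked about would add a quadratic term in $f''(a)$; the standard second-order form (stated here only as the target, not as a derivation from the mean value theorem) is

$$f(x) \approx f(a) + (x - a) f'(a) + \frac{(x - a)^2}{2} f''(a).$$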

Yes, I know about Taylor series, but the question is how to arrive at it using the mean value theorem. Most importantly, I want to learn how to apply the mean value theorem, and how to use it in this situation.
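As a quick numerical illustration of why the $f''(a)$ term helps (this is just a sketch with an arbitrary choice of $f(x) = e^x$ and $a = 0$, not part of the question), one can compare the errors of the first- and second-order approximations:

```python
import math

# Sketch: compare linear and quadratic approximations of
# f(x) = exp(x) around a = 0, where f(a) = f'(a) = f''(a) = e^a.
a = 0.0
x = 0.5

exact = math.exp(x)
linear = math.exp(a) + (x - a) * math.exp(a)            # f(a) + (x - a) f'(a)
quadratic = linear + (x - a) ** 2 / 2 * math.exp(a)     # add (x - a)^2 / 2 * f''(a)

print(abs(exact - linear))     # error of the first-order approximation
print(abs(exact - quadratic))  # smaller error once f''(a) is used
```

The second-order error is noticeably smaller, which is what using $f''(a)$ buys you.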