Proof involving taylor series and finding a point that satisfies an inequality


$f$ is a twice differentiable function on $[a,b]$.

Let $m$ be the minimum of $f$ on $[a,b]$, and $M$ its maximum.

Assume $f(a) \neq m$ and $f(b) \neq m$.

Prove that there exists $c \in [a,b]$ such that:

$|f''(c)| \geq \frac{2(M-m)}{(b-a)^2}$

I tried applying Taylor's theorem at a few 'interesting' points in $[a,b]$. One direction that seemed promising was writing the expansion with the Lagrange remainder around the point $a$:

$f(x) = f(a) + f'(a)(x-a) + \frac{1}{2} f''(c)(x-a)^2$

However, I couldn't get rid of the first-derivative term, which got me stuck. I also tried expanding the function around the minimum point (which lies somewhere in $[a,b]$), but that didn't turn out to be algebraically fruitful either.
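To spell out the second attempt: since $f(a) \neq m$ and $f(b) \neq m$, the minimum is attained at an interior point $x_0 \in (a,b)$, so $f'(x_0) = 0$ and the first-order term drops out of the expansion around $x_0$ (here $c_x$ denotes the intermediate point from the Lagrange remainder, which depends on $x$):

$$f(x) = f(x_0) + \frac{1}{2} f''(c_x)(x - x_0)^2 = m + \frac{1}{2} f''(c_x)(x - x_0)^2.$$

This is where I stopped; I wasn't sure which $x$ to plug in to make $M - m$ appear.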

I'd appreciate a hint or a direction for this.