Bound on 1st Order Taylor Expansion


Consider a differentiable function $f$ with the property that for any points $x,y$ we have $\frac{|f'(x)-f'(y)|}{|x-y|} \leq C.$ How do I show that the following inequality holds for this function: $$f(x) - f(y) - f'(y)(x-y) - \frac{C}{2}(x-y)^2 \leq 0$$ Note the similarity to the first-order Taylor expansion around $y$, which is $f(y) + f'(y)(x-y)$.


On BEST ANSWER

I'll write this out here for clarity:

From the Lagrange form of Taylor's theorem (which assumes $f$ is twice differentiable), we have

$$f(x) = f(y) + f'(y) (x-y) + \frac{f''(\xi)}{2}(x-y)^2$$ for some $\xi$ between $x$ and $y$. So if we can show that $f''(\xi) \leq C$ for every such $\xi$, we are done.

Using the inequality given in the question, which holds for any $x$ and $y$, fix $x$ and let $y \to x$: the difference quotient $\frac{f'(x)-f'(y)}{x-y}$ is bounded in absolute value by $C$, so its limit satisfies $|f''(x)| \leq C$ for all $x$. In particular $f''(x) \leq C$, as required.
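One caveat: a Lipschitz derivative need not itself be differentiable, so the Lagrange remainder is not available in full generality. A sketch of an argument that uses only the hypothesis in the question (written here for $x \geq y$; the case $x < y$ is symmetric) goes through the fundamental theorem of calculus, which applies because a Lipschitz $f'$ is continuous:

```latex
% Sketch: the bound without assuming f'' exists, for x >= y.
% Step 1: FTC applied to f' on [y, x], then subtract f'(y)(x-y).
% Step 2: the Lipschitz hypothesis gives f'(t) - f'(y) <= C (t - y).
\begin{align*}
f(x) - f(y) - f'(y)(x-y)
  &= \int_y^x \bigl( f'(t) - f'(y) \bigr)\, dt \\
  &\leq \int_y^x C\,(t - y)\, dt
   = \frac{C}{2}(x-y)^2 .
\end{align*}
```

Rearranging gives exactly $f(x) - f(y) - f'(y)(x-y) - \frac{C}{2}(x-y)^2 \leq 0$, with no second derivative needed.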