I've been set this problem recently and I'm having a lot of trouble with it. Any help would be much appreciated!
Let $f:\mathbb{R} \rightarrow \mathbb{R}$ be a function with continuous derivatives of all orders, and suppose that, for some $x\in \mathbb{R}$, the derivative $f'(x)$ is non-zero. Thus there exists an interval $D$ containing $x$ such that $f'(y)\neq 0$ for all $y\in D$. Define $F:D \rightarrow \mathbb{R}$ by $F(y)=y - \frac{f(y)-f(x)}{f'(y)}$. Show that $F$ is a Lipschitz function with Lipschitz constant less than 1.
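(One thing I noticed straight away from the definition, in case it is relevant: $x$ itself satisfies $$F(x) = x - \frac{f(x)-f(x)}{f'(x)} = x,$$ so $x$ is a fixed point of $F$.)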
N.B. I think I can prove the Lipschitz part. Just use the Mean Value Theorem?
Since $f$ has continuous derivatives of all orders, so does $F$. So by the Mean Value Theorem, showing that $F$ is Lipschitz with Lipschitz constant less than one amounts to showing that $\sup_{y\in D}|F'(y)| < 1$. Differentiating, I get $$F'(y) = \frac{f''(y)\,[f(y)-f(x)]}{f'(y)^2}.$$ I can see how to find an interval $E \subset D$ containing $x$ on which $\sup_{y\in E} |F'(y)| < 1$, simply by using the continuity of $f$, $f'$ and $f''$. But I don't think $\sup_{y\in D} |F'(y)|$ can be bounded this way in general, because how can you control $f''(y)$ on the whole of $D$?
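For completeness, this is the quotient-rule computation behind that formula (nothing beyond the definition of $F$ is used): $$F'(y) = 1 - \frac{f'(y)\,f'(y) - [f(y)-f(x)]\,f''(y)}{f'(y)^2} = \frac{f''(y)\,[f(y)-f(x)]}{f'(y)^2}.$$ In particular $F'(x) = 0$, so the continuity of $F'$ gives an interval $E$ around $x$ on which, say, $|F'(y)| \le \frac{1}{2} < 1$; that is the observation behind my claim about $E$ above.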