I have a question here that I do not know how to solve.
If $f$ is a continuous function on $[a,b]$, and there exists a constant $M$ such that for every $x$ with $a \leq x < b$ there is a decreasing sequence $a_x(n)$ which converges to $x$ such that $\dfrac{f(a_x(n))-f(x)}{a_x(n)-x}$ converges to a point in $[-M,M]$.
Show that $f$ satisfies the Lipschitz condition.
The problem is that I cannot find a way to turn the given 'sequence' property into a property that holds along the whole interval. Uniform continuity can be used, but it seems too weak compared with the Lipschitz condition, which requires a bound on $\dfrac{f(x)-f(y)}{x-y}$. So I also wonder if the problem is missing some conditions.
Can anyone give some suggestions?
If $f$ were differentiable then we could simply use the mean value theorem: For $a \le x < y \le b$ there is a $c \in (x,y)$ such that $$ \frac{f(y)-f(x)}{y-x} = f'(c) = \lim_{n \to \infty }\frac{f(a_c(n))-f(c)}{a_c(n)-c} \in [-M, M] $$ and we are done.
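As a quick numerical illustration of this mean value theorem step (an example of my own choosing, not part of the question): for $f(t) = t^2$ the difference quotient over $[x,y]$ equals $x+y = f'(c)$ with $c = \frac{x+y}{2}$, so any bound $|f'| \le M$ immediately bounds every difference quotient.

```python
# Illustration of the MVT step in the differentiable case (my own example):
# for f(t) = t^2, (f(y) - f(x)) / (y - x) = x + y = f'(c) at the midpoint
# c = (x + y) / 2, and a bound |f'| <= M on the interval then bounds
# every difference quotient.

def f(t):
    return t * t

def fprime(t):
    return 2.0 * t

x, y = 0.2, 0.9
quotient = (f(y) - f(x)) / (y - x)
c = (x + y) / 2.0  # the MVT point for a quadratic is the midpoint

# The quotient agrees with f'(c) up to floating-point error.
assert abs(quotient - fprime(c)) < 1e-9

# On [0, 1] we have |f'| <= M = 2, so the quotient lies in [-M, M].
M = 2.0
assert -M <= quotient <= M
```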
But $f$ is only assumed to be continuous. The desired conclusion can still be proven by mimicking the proof of the mean value theorem (and Rolle's theorem, on which it is based).
Let $a \le x < y \le b$. We define the function $g: [x, y] \to \Bbb R$ as $$ g(t) = f(t) - \frac{f(y)-f(x)}{y-x} (t-x) \,. $$ Note that $g(x) = g(y)$, so that $g$ assumes its minimum at some point $c \in [x, y)$ (if the minimum were attained only at $y$, it would also be attained at $x$). It follows that for all $t \in (c, y)$ $$ 0 \le g(t) -g(c) = f(t) - f(c) - \frac{f(y)-f(x)}{y-x}(t-c) \\ \implies \frac{f(y)-f(x)}{y-x} \le \frac{f(t)-f(c)}{t-c} \,. $$ Since $a_c(n)$ decreases to $c < y$, we have $a_c(n) \in (c, y)$ for all sufficiently large $n$, so setting $t = a_c(n)$ and passing to the limit we conclude that $$ \frac{f(y)-f(x)}{y-x} \le \lim_{n \to \infty } \frac{f(a_c(n))-f(c)}{a_c(n)-c} \le M \,. $$ Similarly, if $g$ attains its maximum at some $d \in [x, y)$, then $g(t) \le g(d)$ for $t \in (d, y)$ gives $$ \frac{f(a_d(n))-f(d)}{a_d(n)-d} \le \frac{f(y)-f(x)}{y-x} $$ for large $n$, and letting $n \to \infty$ yields $$ \frac{f(y)-f(x)}{y-x} \ge -M \,, $$ which finishes the proof.