Reverse Lipschitz condition


Assume that $f:[a,b]\to\mathbb{R}_+$ is continuous, differentiable with $f'(x)>0$, and Lipschitz. For $y>x$, I want to understand whether $$ f(y)-f(x) \ge L (y-x) $$ holds, where $L$ is a strictly positive constant.

My intuition is that it holds by uniform continuity.

A similar question has already been asked, but this one is more specific and carries additional assumptions.


There are 2 best solutions below

Answer 1

Such an inequality would entail $f'(x)\ge L>0$ for every $x$ (let $y \to x$, or apply the MVT). So for a counterexample, choose an $f$ whose derivative on the interval is bounded and non-negative but has a zero; note that this relaxes the strict positivity $f'>0$ assumed in the question.

Simple example: $f'(x)=3x^2$ on $[-1,1]$, i.e. $f(x)=x^3$; near $x=0$ the difference quotients are arbitrarily small.
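A quick numerical sketch of this counterexample (the symmetric pairs around $0$ are chosen here for illustration): for $f(x)=x^3$, the quotient over $[-h,h]$ is exactly $h^2$, which drops below any fixed $L>0$.

```python
# For f(x) = x^3 on [-1, 1], the difference quotient (f(y) - f(x)) / (y - x)
# near x = 0 can be made arbitrarily small, so no constant L > 0 can satisfy
# f(y) - f(x) >= L (y - x) on the whole interval.

def f(x):
    return x ** 3

# Symmetric pairs around 0: quotient = (h^3 - (-h)^3) / (2h) = h^2 -> 0.
quotients = []
for k in range(1, 8):
    h = 10.0 ** (-k)
    quotients.append((f(h) - f(-h)) / (2 * h))

print(quotients)  # the values h^2 shrink toward 0
```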

Answer 2

This is actually an interesting question. It is certainly true if $f$ is $C^1$ on $[a,b]$.

Suppose that $f'(x) > 0$ for all $x \in [a,b]$. By the compactness of $[a,b]$ and the continuity of $f'$, the infimum is attained, so $\alpha \stackrel{\rm def}{=}\min_{x \in [a,b]} f'(x) > 0$. Since for any $x<y$ in $[a,b]$ the MVT gives $\frac{f(y) - f(x)}{y-x} = f'(\lambda)$ for some $\lambda \in (x,y)$, it follows that $\inf_{x <y \in [a,b]} \frac{f(y) - f(x)}{y-x} \geq \alpha > 0$. Hence for all $x<y$ in $[a,b]$ we have $f(y) - f(x) \geq \alpha(y-x)$.
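A numerical illustration of the $C^1$ case (the example function $f(x)=e^x$ on $[0,1]$ and the grid size are chosen here, not in the answer): the minimum of $f'$ really does bound every difference quotient from below.

```python
# C^1 case: f(x) = exp(x) on [0, 1] has continuous, strictly positive
# derivative f' = exp, so alpha = min f' = exp(0) = 1, and every difference
# quotient (f(y) - f(x)) / (y - x) should be >= alpha by the MVT.
import math

a, b, n = 0.0, 1.0, 200
grid = [a + (b - a) * i / n for i in range(n + 1)]
vals = [math.exp(x) for x in grid]

alpha = min(vals)  # the minimum of f' = exp over [0, 1], attained at x = 0

min_quotient = min(
    (vals[j] - vals[i]) / (grid[j] - grid[i])
    for i in range(len(grid))
    for j in range(i + 1, len(grid))
)
print(alpha, min_quotient)  # min_quotient stays above alpha
```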

However, if we drop the $C^1$ assumption, it is no longer true. Specifically, we can exhibit a differentiable (but not $C^1$) function $f:[0,1] \to \mathbb{R}$ whose derivative $f'$ is bounded (hence $f$ is Lipschitz) and strictly positive, but for which $\inf_{x<y \in [0,1]} \frac{f(y) - f(x)}{y-x} = 0$. The function is $$f(x) = \int_{0}^{x} \left(\frac{\sin \frac{1}{t}}{1 + t^2} + 1\right) dt$$ where we set $\sin \frac{1}{0} = 0$ (for notational simplicity, we take $\sin \frac{1}{0} = \cos \frac{1}{0} = 0$ throughout the rest of the discussion).

By the first fundamental theorem of calculus, for $x \in (0,1]$ we have $f'(x) = \frac{\sin \frac{1}{x}}{1 + x^2} + 1$. Moreover, $f'(x) > 0$ for $x \in (0,1]$, since $\sin \frac{1}{x} \ge -1$ yields the lower bound $f'(x) \ge \frac{-1}{1+x^2} + 1 = \frac{x^2}{1+x^2} > 0$.

Since the integrand is discontinuous at $x=0$, we must verify the differentiability of $f$ at $x=0$ by hand. We claim $f'(0) = 1$. Observe that $$f'(0) = \lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left(\frac{\sin \frac{1}{t}}{1 + t^2} + 1\right) dt = 1 + \lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left(\frac{\sin \frac{1}{t}}{1 + t^2}\right) dt $$

The tricky bit is verifying $$\lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left(\frac{\sin \frac{1}{t}}{1 + t^2}\right) dt = 0$$

There are two steps here. First, we argue that $$\lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left({\sin \frac{1}{t}}\right) dt = 0$$

We first show that there is a globally differentiable function $g:\mathbb{R} \to \mathbb{R}$ such that $g(0) = 0$ and $g'(x) = \sin \frac{1}{x}$ for all $x \in \mathbb{R}$. The function $g(x) = x^2\cos\left(\frac{1}{x}\right) - \int_{0}^{x} 2t \cos \frac{1}{t}\, dt$ satisfies these constraints. Indeed, that $p(x) \stackrel{\rm def}{=} x^2\cos\left(\frac{1}{x}\right)$ is globally differentiable with $p'(x) = \sin \frac{1}{x} + 2x \cos \frac{1}{x}$ is a standard elementary analysis exercise, and $h(x) \stackrel{\rm def}{=} \int_{0}^{x} 2t \cos \frac{1}{t}\, dt$ is globally differentiable with $h'(x) = 2x \cos \frac{1}{x}$ since its integrand is continuous, so the FTC applies. Hence the difference $g(x) = p(x) - h(x)$ is differentiable with $g'(x) = \sin \frac{1}{x}$ and $g(0) = 0$.

The fundamental theorem of calculus [footnote $\mathbf{1}$ for specific details] then implies that $g(x) = \int_{0}^{x} \sin \frac{1}{t}\, dt$ for all $x \in \mathbb{R}$, and since $g'(0) = 0$, we conclude that $\lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left({\sin \frac{1}{t}}\right) dt = 0$, which is what we wanted.
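In fact, this step admits a quantitative bound; a short sketch, using the same $g$, $p$, and $h$ as above:

```latex
% Both terms of g(\epsilon) = p(\epsilon) - h(\epsilon) are O(\epsilon^2):
\left| \int_{0}^{\epsilon} \sin\tfrac{1}{t}\, dt \right|
  = |g(\epsilon)|
  \le \epsilon^2 \left|\cos\tfrac{1}{\epsilon}\right|
      + \int_{0}^{\epsilon} 2t \left|\cos\tfrac{1}{t}\right| dt
  \le \epsilon^2 + \epsilon^2
  = 2\epsilon^2,
\qquad\text{hence}\qquad
\left| \epsilon^{-1} \int_{0}^{\epsilon} \sin\tfrac{1}{t}\, dt \right|
  \le 2\epsilon \xrightarrow[\epsilon \to 0^{+}]{} 0.
```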

The second step is this: $$\left|\lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left(\frac{\sin \frac{1}{t}}{1 + t^2} - \sin \frac{1}{t}\right) dt\right| = \lim_{\epsilon \to 0^{+}} \left|\epsilon^{-1}\int_{0}^{\epsilon}\left(\frac{-t^2 \sin \frac{1}{t}}{1+t^2}\right) dt \right| \leq \lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left|\frac{-t^2 \sin \frac{1}{t}}{1+t^2}\right| dt \leq \lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left|\frac{t^2}{1+t^2}\right| dt \leq \lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left|\frac{\epsilon^2}{1+0^2}\right| dt \leq \lim_{\epsilon \to 0^{+}} \epsilon^{-1} \epsilon^{3} = 0$$

This gives us that $$\lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left(\frac{\sin \frac{1}{t}}{1 + t^2} - \sin \frac{1}{t}\right) dt = 0$$

Hence, by the linearity of the integral and the aforementioned results, we have that

$$\lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left(\frac{\sin \frac{1}{t}}{1 + t^2}\right) dt = \lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left({\sin \frac{1}{t}}\right) dt + \lim_{\epsilon \to 0^{+}} \epsilon^{-1}\int_{0}^{\epsilon}\left(\frac{\sin \frac{1}{t}}{1 + t^2} - \sin \frac{1}{t}\right) dt = 0 + 0 = 0$$

Overall, this shows that the function $f:[0,1] \to \mathbb{R}$ constructed above satisfies $f'(0) = 1$ and, as noted earlier, $f'(x) > 0$ for $x \in (0,1]$; hence indeed $f'(x)>0$ for all $x \in [0,1]$. We claim that $\inf_{x \in [0,1]} f'(x) = 0$. To see this, set $x_n = \frac{2}{(4n-1)\pi}$, so that $\sin \frac{1}{x_n} = \sin \frac{(4n-1)\pi}{2} = -1$; then the sequence $\left\{f'(x_n)\right\}_{n \geq 2} = \left\{\frac{x_n^2}{1 + x_n^2}\right\}_{n \geq 2}$ is strictly decreasing and converges to $0$.
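A numerical sketch of these two facts about the derivative (the sampling grid and the range of $n$ are chosen here for illustration):

```python
# On (0, 1], f'(x) = sin(1/x)/(1 + x^2) + 1 is bounded below by
# x^2/(1 + x^2) > 0, yet along x_n = 2/((4n - 1) pi) we get sin(1/x_n) = -1,
# so f'(x_n) = x_n^2/(1 + x_n^2), strictly decreasing toward 0.
import math

def fprime(x):
    """Closed form of f' on (0, 1] from the construction above."""
    return math.sin(1.0 / x) / (1.0 + x * x) + 1.0

# f' stays strictly positive on a fine sample of (0, 1] ...
samples = [i / 1000.0 for i in range(1, 1001)]
positive = all(fprime(x) > 0.0 for x in samples)

# ... but f'(x_n) shrinks toward 0, so inf f' = 0.
seq = [fprime(2.0 / ((4 * n - 1) * math.pi)) for n in range(2, 200)]
print(positive, seq[0], seq[-1])
```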

Since $\inf_{x \in [0,1]} f'(x) = 0$, and derivatives can be approximated to arbitrary precision by difference quotients, we must also have $\inf_{x<y \in [0,1]} \frac{f(y) - f(x)}{y-x} = 0$. This proves the claim: no positive constant $L$ exists with $f(y) - f(x) \geq L(y-x)$ for all $y>x$ in $[0,1]$.
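One can also see the difference quotients themselves collapse numerically. A sketch (the index $n$, window size `delta`, and midpoint-rule resolution are choices made here): since $f(y)-f(x) = \int_x^y f'(t)\,dt$, the quotient is the average of $f'$ over $[x,y]$, and over a tiny window centred at $x_n = \frac{2}{(4n-1)\pi}$ that average is close to the tiny value $f'(x_n)$.

```python
# The difference quotient (f(y) - f(x))/(y - x) equals the average of f' over
# [x, y]; we estimate that average by a midpoint rule on a window around x_n,
# small enough that f' is nearly constant across it.
import math

def fprime(x):
    return math.sin(1.0 / x) / (1.0 + x * x) + 1.0

n = 200
xn = 2.0 / ((4 * n - 1) * math.pi)  # a point where sin(1/x) = -1
delta = 1e-9                         # window tiny vs. oscillation scale ~ xn^2

x, y = xn - delta, xn + delta
m = 1000
# Midpoint-rule estimate of (1/(y - x)) * integral of f' over [x, y].
quotient = sum(fprime(x + (y - x) * (i + 0.5) / m) for i in range(m)) / m
print(quotient)  # a difference quotient far below any fixed L > 0
```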

[$\mathbf{1}$]: The stronger version of the fundamental theorem of calculus (proven, for instance, in Spivak's text) tells us that if $f:\mathbb{R} \to \mathbb{R}$ is Riemann integrable on some compact $[a,b]$ and if $F'(t) = f(t)$ for all $t \in [a,b]$, then $F(b) - F(a) = \int_{a}^{b} f(t) \ dt$. This is stronger because $f$ is not required to be continuous, and this is important here. This is why the "singularity" of $\sin \frac{1}{t}$ at $t=0$ is irrelevant in this particular context. Since we know $\sin \frac{1}{t}$ is the derivative of some other function (namely, $g$) and it's also Riemann integrable on all compact intervals $[a,b] \subset \mathbb{R}$, we do not require continuity to deduce that $g(x) = g(x) - g(0) = \int_{0}^{x} \sin \frac{1}{t} dt$.