Understanding Lipschitz Continuity

I have heard functions described as Lipschitz continuous several times in my classes, yet I have never really understood exactly what this concept is.

Here is the definition.

$\left | f(x_{1})-f(x_{2}) \right |\leq K\left | x_{1}-x_{2} \right |$

Here is the function I'm using; it is known to be Lipschitz continuous.

$f(x)=\sqrt{x^2+5}$

If you pick some points, say $x_{1}=1$ and $x_{2}=2$, then $f(1)=\sqrt{6}\approx 2.449$ and $f(2)=\sqrt{9}=3$.

The result is:

$\left | 2.449-3 \right |\leq K\left | 1-2 \right |$

So as long as $K$ is $0.551$ or bigger, then this function is Lipschitz continuous?

What if I pick $K$ to be $0.0001$? Is the function no longer Lipschitz continuous?

This is hard for me to understand: why not always pick $K$ to be very large, so that the function is always Lipschitz continuous?

Unless the left-hand side of the inequality is infinite, can't you always find a $K$ big enough to satisfy it?
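
For what it's worth, a single pair of points only ever gives a lower bound on $K$; the real question is whether one $K$ works for every pair at once. A small numerical sketch in Python (the sampling range, seed, and pair count are arbitrary choices):

```python
import math
import random

def f(x):
    # the function from the question
    return math.sqrt(x**2 + 5)

# Sample many random pairs and track the largest slope |f(x1) - f(x2)| / |x1 - x2|.
# Any single pair only gives a *lower* bound on K; Lipschitz continuity asks for
# one K that dominates every pair at once.
random.seed(0)
worst = 0.0
for _ in range(100_000):
    x1, x2 = random.uniform(-100, 100), random.uniform(-100, 100)
    if x1 != x2:
        worst = max(worst, abs(f(x1) - f(x2)) / abs(x1 - x2))

print(worst)  # empirically stays below 1, hinting that K = 1 works for this f
```

Here a tiny $K$ like $0.0001$ fails, because many sampled slopes exceed it, while any $K \geq 1$ succeeds for every pair.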

There are 4 best solutions below

You cannot pick $K$ large enough to make a function Lipschitz continuous if it is not. That is the main point of this kind of continuity. If $f$ is not Lipschitz continuous and you claim that $K = 10^6$ works, I can find a pair of points $x_1$ and $x_2$ such that $|f(x_1) - f(x_2)| > 10^6|x_1 - x_2|$.

Think about the mean value theorem and Lipschitz continuity.

The mean value theorem says that if $f$ is continuous on $[a,b]$ and differentiable on $(a,b)$, then

$\exists c \in (a,b)$ such that $\displaystyle\frac{f(b) - f(a)}{b-a} = f'(c)$.

Lipschitz says that

$\exists K > 0 \mbox{ such that } \forall a,b \in D_f \mbox{ with } a \neq b,\ \displaystyle\frac{|f(b) - f(a)|}{|b - a|} \leq K$.

So if the derivative of $f$ is bounded, then $f$ will be Lipschitz.
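
For the $f$ from the question this criterion is easy to apply: $f'(x) = x/\sqrt{x^2+5}$, and $|f'(x)| = |x|/\sqrt{x^2+5} < 1$ for every $x$, so $K = 1$ works. A quick numerical spot-check (the grid is an arbitrary choice):

```python
import math

def f_prime(x):
    # derivative of f(x) = sqrt(x^2 + 5), computed by hand
    return x / math.sqrt(x**2 + 5)

# |f'(x)| = |x| / sqrt(x^2 + 5) < |x| / sqrt(x^2) = 1 for all x != 0,
# so by the mean value theorem K = 1 is a valid Lipschitz constant for f.
xs = [i / 10 for i in range(-1000, 1001)]
bound = max(abs(f_prime(x)) for x in xs)
print(bound)  # close to, but below, 1
```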

Consider the case

$f(x) = \sqrt{x}$ for $x \in [0,1]$. Then $f$ is not Lipschitz, since $\displaystyle\sup_{x \in (0,1]}f'(x) = \displaystyle\lim_{x\to 0^+} f'(x) = +\infty$.
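
Numerically, the difference quotients of $\sqrt{x}$ anchored at $0$ make the failure visible (the sample points are arbitrary):

```python
import math

# (sqrt(x) - sqrt(0)) / (x - 0) = 1 / sqrt(x), which blows up as x -> 0+,
# so no single finite K can dominate all of these slopes.
for x in [1.0, 1e-2, 1e-4, 1e-6]:
    print(x, math.sqrt(x) / x)  # slopes ~1, 10, 100, 1000, ...
```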

Also, as an additional note, if a function $f$ defined on $S \subseteq \mathbb R$ is Lipschitz continuous, then $f$ is uniformly continuous on $S$.

--

Lipschitz continuity doesn't say that for any particular pair $x,y$ you can find some $M$ with $|f(x)-f(y)|\leq M|x-y|$. It says that $M$ is fixed first and the inequality then holds for all $x,y \in \mathbb R$. This is a much stronger condition. If all you needed to do was pick some $M$ for every choice of $x,y$, the condition would mean nothing.
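
A concrete example of the failure mode: $f(x) = x^2$ is not Lipschitz on all of $\mathbb R$, because the slope between $x$ and $x+1$ is $2x+1$, which eventually beats any fixed $M$. The helper below is hypothetical, purely for illustration:

```python
def violating_pair(M):
    # hypothetical helper: returns a pair whose slope for f(x) = x^2 exceeds M
    return M, M + 1  # slope |(M+1)^2 - M^2| / 1 = 2M + 1 > M for M >= 0

for M in [10, 1_000_000]:
    x1, x2 = violating_pair(M)
    slope = abs(x1**2 - x2**2) / abs(x1 - x2)
    print(M, slope)  # slope = 2*M + 1, always beating the proposed M
```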

If your function is differentiable, Lipschitz continuity just says that the function has a bounded derivative. I think of this as a "wiggling" and "stretching" bound.

For the stretching, it says that a function $f:\mathbb R \to \mathbb R$ can't grow too fast. For example, if the Lipschitz constant were $1$ and $f(0)=0$, then the graph is trapped in the cone between the two lines $y=x$ and $y=-x$. This essentially means that it can't grow too quickly, or wiggle too much.

If the constant $M < 1$, the function is said to be a contraction, which has many nice properties as well.
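
For example, $\cos$ is a contraction on $[0,1]$: it maps $[0,1]$ into itself and $|\cos'(x)| = |\sin x| \leq \sin 1 \approx 0.84 < 1$ there. By the Banach fixed-point theorem, iterating it from any start converges to the unique fixed point, which a short sketch confirms:

```python
import math

# Iterate the contraction cos on [0, 1]; Banach's fixed-point theorem
# guarantees convergence to the unique solution of cos(x) = x.
x = 0.5
for _ in range(100):
    x = math.cos(x)

print(x)  # ~0.739085, the unique solution of cos(x) = x
print(abs(math.cos(x) - x))  # essentially 0: x is a fixed point
```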

--


Definition: $f$ is Lipschitz iff there exists $K>0$ such that for all $x,y\in dom(f)$ we have $$(1)\quad |f(x)-f(y)|\leq K |x-y|.$$ For $S\subset dom(f)$ we say $f$ is Lipschitz on $S$ to mean that $f$ restricted to the domain $S$ is a Lipschitz function. This is most commonly done when $S$ is an interval or a half-line.

The value of $K$ is not unique, for if (1) holds for all $x,y\in dom(f)$ then it will also hold with $K$ replaced by $42K$. What matters is whether at least one $K>0$ exists such that (1) holds for all $x,y \in dom(f)$.

0
On

Lipschitz continuity gives you a bound on how fast points can be pulled apart in the image.

Setup

  • Suppose that $f:X\rightarrow Y$ is a map between metric spaces.

  • Suppose that $\omega:[0,\infty)\rightarrow [0,\infty)$ is continuous and fixes $0$. We think of this map as "quantifying the maximum rate at which nearby points can get pulled apart by our map". We call this $\omega$ a modulus of continuity of $f$.

  • Let $x$ and $z$ be points in $X$.

Explanation If $x$ and $z$ are close, say their distance $d(x,z)$ is $1/10^{1000}$, then their images in $Y$ under $f$, say $f(x)$ and $f(z)$, are also close. Moreover, we can bound their distance: precisely, they are no more than $\omega(1/10^{1000})$ apart.

Likewise, if both points are far apart, say their distance $d(x,z)$ is $1000000$, then their images in $Y$ under $f$, say $f(x)$ and $f(z)$, are no more than $\omega(1000000)$ apart. So these two images can be far apart, but no further than $\omega(1000000)$.

If $X$ is compact, e.g. $[0,1]$ with the Euclidean distance, and $f$ is continuous, then such an $\omega$ must exist. However, there's generally no guarantee that it's easy to work with. This is where the Lipschitz condition comes in. It postulates that the thing quantifying the worst-case stretching, namely $\omega$, is a linear map ($\omega(t)=Kt$, as opposed to something weird like $\omega(t)=\max\{t,\sqrt{t}\}$).
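
As a sanity check on this picture: $\sqrt{x}$ on $[0,1]$ fails the linear (Lipschitz) condition, but $\omega(t)=\sqrt{t}$ is a valid modulus for it, since $|\sqrt{x}-\sqrt{y}|\leq\sqrt{|x-y|}$ for $x,y\geq 0$. A numerical spot-check (the seed and tiny tolerance are arbitrary; the tolerance only guards against floating-point rounding):

```python
import math
import random

# Verify |sqrt(x) - sqrt(y)| <= sqrt(|x - y|) on many random pairs in [0, 1]:
# sqrt admits the non-linear modulus omega(t) = sqrt(t) even though it is
# not Lipschitz near 0.
random.seed(1)
ok = all(
    abs(math.sqrt(x) - math.sqrt(y)) <= math.sqrt(abs(x - y)) + 1e-12
    for x, y in ((random.uniform(0, 1), random.uniform(0, 1)) for _ in range(100_000))
)
print(ok)  # True
```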


Background The Lipschitz condition becomes reasonable due to the Rademacher-Stepanov theorem, which states that, in the case where $X$ and $Y$ are Euclidean and $X$ is compact, a function is Lipschitz iff it is almost everywhere differentiable with uniformly bounded gradient.