Lipschitz Condition


I am independently studying numerical analysis and came across a question on which I am stuck. Assume that $g$ is differentiable on $[x_0-p, x_0+p]$. Show that if $|g'(x)|<1$ on this interval, then $g$ satisfies $|g(x)-g(x_0)|\le \lambda |x-x_0|$ for all $x\in[x_0-p, x_0+p]$, for some constant $\lambda$ with $0\le \lambda < 1$.
Any help is much appreciated.


2 Answers


Hint: use the Mean Value Theorem (also known as Lagrange's theorem).


By the Mean Value Theorem, $$ g(x)-g(x_0) = g'(\xi)(x-x_0) $$ for some $\xi$ strictly between $x$ and $x_0$, so in particular $\xi\in [x_0-p,x_0+p]$. Taking absolute values gives $$ |g(x)-g(x_0)| = |g'(\xi)|\,|x-x_0| \leq \lambda |x-x_0|, $$ where $\lambda := \sup_{t\in[x_0-p,x_0+p]} |g'(t)|$. If $g'$ is continuous (as is usually assumed in this setting), then $|g'|$ attains its maximum on the compact interval $[x_0-p,x_0+p]$, and since $|g'(t)|<1$ at every point, that maximum satisfies $0\leq\lambda<1$. As you said, this is known as the Lipschitz condition. It is useful in numerical analysis because it lets us prove fixed point theorems: the iteration $x_{n+1}=g(x_n)$ converges when $g$ is a contraction.
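To illustrate why the condition $\lambda<1$ matters in practice, here is a minimal sketch (not from the original answers) of fixed-point iteration. It uses $g(x)=\cos x$ as an assumed example, since $|g'(x)|=|\sin x|<1$ on a neighborhood of its fixed point, making $g$ a contraction there:

```python
import math

def fixed_point_iterate(g, x0, tol=1e-12, max_iter=200):
    """Iterate x_{n+1} = g(x_n) until successive iterates differ by less than tol.

    Converges when g is a contraction, i.e. |g(x) - g(y)| <= lam * |x - y|
    with lam < 1, which the Lipschitz condition above guarantees when |g'| < 1.
    """
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x  # best available approximation after max_iter steps

# Example: solve x = cos(x); the iterates contract toward the unique fixed point
root = fixed_point_iterate(math.cos, 0.5)
```

The error shrinks by roughly a factor of $\lambda$ per step, so a smaller bound on $|g'|$ near the fixed point means faster convergence.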