A consequence of Lipschitzianity in $[a, b]\subset\mathbb{R}$


I have to prove the following statement. Let $g:[a, b]\subset\mathbb{R}\longrightarrow\mathbb{R}$ be Lipschitz. Then $$ L(g, [a, b])=\frac{|g(b)-g(a)|}{b-a}\Longrightarrow \ g(a+t(b-a))=g(a)+t(g(b)-g(a))\ \ \text{for}\ 0\leq t\leq1, $$ where $L(g, [a, b])$ is the Lipschitz constant of $g$ on $[a, b]$.

How can I show this? Any help is appreciated.

Thank you.


On BEST ANSWER

Hint: The hypothesis says that $|g(b)-g(a)|$ is as large as it can possibly get, given the Lipschitz constant. Draw a picture, and imagine what happens if the graph of $g$ is not a straight line between $a$ and $b$. It would have to be steeper than $L$ somewhere.

Edit: Here is a second hint. With $0<t<1$, put $x=a+t(b-a)$ and $y=g(a)+t(g(b)-g(a))$. You want to prove that $g(x)=y$. The two inequalities $|g(x)-g(a)|\le L(x-a)$ and $|g(b)-g(x)|\le L(b-x)$ become four inequalities if you undo the absolute values on the left. Perhaps you can use them to prove that $g(x)\le y$ and $g(x)\ge y$?
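Following this hint, the two estimates can be assembled into a short derivation (a sketch, assuming without loss of generality that $g(b)\ge g(a)$, so that $L=\frac{g(b)-g(a)}{b-a}$; otherwise replace $g$ by $-g$):

```latex
% WLOG g(b) >= g(a), so L(b-a) = g(b) - g(a).
% With x = a + t(b-a) we have x - a = t(b-a) and b - x = (1-t)(b-a).
\begin{align*}
g(x)-g(a) &\le L(x-a) = t\,(g(b)-g(a))
  &&\Longrightarrow & g(x) &\le g(a)+t\,(g(b)-g(a)) = y,\\
g(b)-g(x) &\le L(b-x) = (1-t)\,(g(b)-g(a))
  &&\Longrightarrow & g(x) &\ge g(b)-(1-t)\,(g(b)-g(a)) = y.
\end{align*}
```

Since $g(b)-(1-t)(g(b)-g(a))=g(a)+t(g(b)-g(a))$, the two lines together give $g(x)=y$, which is the claimed linearity.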