Let $a, b \in \mathbb{R}$ with $a < b$, and consider $f:[a,b] \rightarrow \mathbb{R}$ such that $f$ is continuous on $[a, b]$ and Lipschitz continuous on $(a, b)$. Does this necessarily imply that $f$ is Lipschitz continuous on $[a, b]$?
Although this seems obviously true, I don't see how to prove it using the continuity of $f$. I'd appreciate it if someone could guide me toward some intuition, e.g., by providing a specific case of this property.
Thanks!
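For intuition, here is a quick numerical sanity check (illustrative only, not a proof). I'm picking $f(x)=x^2$ on $[0,1]$: it is Lipschitz on $(0,1)$ with $L=2$, since $\lvert x^2-y^2\rvert=\lvert x+y\rvert\,\lvert x-y\rvert\leq 2\lvert x-y\rvert$ there, and the claim predicts the same constant works on the closed interval:

```python
# f(x) = x**2 is continuous on [0, 1] and Lipschitz on (0, 1) with L = 2.
# We sample the CLOSED interval, endpoints included, and check that no
# difference quotient exceeds L.

def f(x):
    return x * x

L = 2.0
a, b = 0.0, 1.0
n = 200
pts = [a + (b - a) * i / n for i in range(n + 1)]  # includes a and b

# Largest difference quotient |f(x) - f(y)| / |x - y| over all sampled pairs.
worst = max(
    abs(f(x) - f(y)) / (y - x)          # y > x, so y - x > 0
    for i, x in enumerate(pts)
    for y in pts[i + 1:]
)
print(worst)
assert worst <= L
```

The maximum is attained by the pair closest to the right endpoint (the quotient equals $x+y$ here), and it stays below $L=2$ even with the endpoints included.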
Let $L$ be a Lipschitz constant for $f$ on $(a,b)$, and take $x\in(a,b)$. Let $\varepsilon>0$, and use the continuity of $f$ at $a$ to choose a $\delta>0$ with $a+\delta<b$ such that
$$\lvert f(a)-f(\xi)\rvert\leq\varepsilon$$
for all $\xi\in(a,a+\delta]$. Then, for any $\xi\in(a,a+\delta]$,
$$\lvert f(a)-f(x)\rvert\leq\lvert f(a)-f(\xi)\rvert+\lvert f(\xi)-f(x)\rvert\leq\varepsilon+L\lvert \xi-x\rvert.$$
Letting $\xi\downarrow a$, we get that
$$\lvert f(a)-f(x)\rvert\leq L\lvert a-x\rvert+\varepsilon.$$
As $\varepsilon$ was arbitrary,
$$\lvert f(a)-f(x)\rvert\leq L\lvert a-x\rvert.$$
A similar argument at the endpoint $b$ covers the remaining case, so $f$ is Lipschitz continuous on $[a,b]$ with the same constant $L$.
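Spelled out (a direct transcription of the argument above, included only for completeness), the mirror estimate at $b$ reads: choose $\delta>0$ by continuity of $f$ at $b$ so that $\lvert f(b)-f(\xi)\rvert\leq\varepsilon$ for all $\xi\in[b-\delta,b)$; then for $x\in(a,b)$,
$$\lvert f(b)-f(x)\rvert\leq\lvert f(b)-f(\xi)\rvert+\lvert f(\xi)-f(x)\rvert\leq\varepsilon+L\lvert \xi-x\rvert,$$
and letting $\xi\uparrow b$ and then $\varepsilon\downarrow 0$ gives
$$\lvert f(b)-f(x)\rvert\leq L\lvert b-x\rvert.$$
Finally, $\lvert f(a)-f(b)\rvert\leq L\lvert a-b\rvert$ follows from the estimate at $a$ by letting $x\uparrow b$ and using continuity at $b$.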