Why does $f$ need to be continuous on $[a,b]$ in this theorem? Continuity on $(a,b)$ is already implied by differentiability on $(a,b)$, so why do we bother with the stronger prerequisite?
Let $f$ be a real function which is continuous on the closed interval $[a,b]$ and differentiable on the open interval $(a,b)$.
- If $\forall x \in (a,b): f'(x) \ge 0$, then $f$ is increasing on $[a,b]$.
- If $\forall x \in (a,b): f'(x) > 0$, then $f$ is strictly increasing on $[a,b]$.
- If $\forall x \in (a,b): f'(x) \le 0$, then $f$ is decreasing on $[a,b]$.
- If $\forall x \in (a,b): f'(x) < 0$, then $f$ is strictly decreasing on $[a,b]$.
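The role of continuity on the closed interval becomes visible in the standard proof: each statement follows from the Mean Value Theorem, applied to arbitrary $x_1 < x_2$ in $[a,b]$. Here is a sketch of the first statement (the other three are analogous):

```latex
% MVT applied on [x_1, x_2] \subseteq [a, b]:
% it requires f continuous on [x_1, x_2] and differentiable on (x_1, x_2).
f(x_2) - f(x_1) = f'(c)\,(x_2 - x_1) \quad \text{for some } c \in (x_1, x_2).
% If f'(c) \ge 0 for every such c, the right-hand side is \ge 0,
% hence f(x_1) \le f(x_2), i.e. f is increasing on [a, b].
```

Note that when $x_2 = b$ (or $x_1 = a$), the Mean Value Theorem needs continuity at the endpoint itself, which differentiability on the open interval $(a,b)$ does not provide.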
Consider e.g. the function
$$ f(x) = \begin{cases} 0, & 0 \leq x < 1 \\ -1, & x = 1 \end{cases} $$
on the interval $[0,1]$. Then you have $f'(x) \geq 0$ on $(0,1)$, yet $f$ is not increasing on $[0,1]$ because $f(1) < f(x)$ for all $x \in [0,1)$. The problem is that $f$ is not continuous at the endpoint $x = 1$. Similar counterexamples work for the other three statements of the theorem.
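For instance, the decreasing case fails in the same way; one function that does the job (my own illustration, built by the same pattern) is

```latex
% g'(x) = 0 \le 0 on (0,1), yet g is not decreasing on [0,1],
% since g(1) = 1 > 0 = g(x) for all x \in [0,1).
% Again the culprit is the discontinuity at the endpoint x = 1.
g(x) = \begin{cases} 0, & 0 \leq x < 1 \\ 1, & x = 1 \end{cases}
```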