Suppose we have a function $f$ that is defined on a closed interval $[a,b]$.
The following can be proven from the Mean Value Theorem:
If
- $f$ is continuous on the interval $[a,b]$
- $f'(x)=0$ for all $x∈(a,b)$
then $f$ is constant on $[a,b]$.
The problem I have with the above theorem is the following:
Suppose there were a function $f$ satisfying all of the above conditions for which, additionally, $f'(a)$ or $f'(b)$ is a non-zero real number. According to the theorem, $f$ would still be constant on $[a,b]$.
Obviously, $f(x)=c$ for all $x∈[a,b]$ and $f'(a)≠0$ can't both be true at the same time.
Where is the problem here?
The point of the hypotheses is that $f$ need only be continuous on the closed interval $[a,b]$; differentiability is required only on the open interval $(a,b)$, so $f$ need not even be differentiable at $a$ or $b$ for the conclusion to hold.
However, if in fact $f$ is differentiable at $a$ (or at $b$), then that derivative must be $0$. This follows from Darboux's theorem, which states that a derivative always has the intermediate value property; in particular, a derivative cannot have a jump discontinuity. So if $f'(a)$ exists and $\lim\limits_{x\to a^+} f'(x)=\ell$, then $f'(a)=\ell$.
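Applied to the situation in the question, this resolves the apparent paradox. A sketch of the step, assuming $f'(a)$ exists:

```latex
% Since f'(x) = 0 for every x in (a,b), the one-sided limit of f' at a is
\[
  \lim_{x \to a^+} f'(x) = 0 ,
\]
% so, because a derivative cannot have a jump discontinuity, the
% existence of f'(a) forces
\[
  f'(a) = \lim_{x \to a^+} f'(x) = 0 .
\]
% The same argument at the other endpoint gives f'(b) = 0, so the
% scenario in the question (f'(a) or f'(b) non-zero) can never occur.
```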
EDIT: Spurred on by @MartinR's comment, here's a much more direct proof. Suppose the right-hand derivative $f'_+(a)$ exists. The theorem already gives that $f$ is constant on $[a,b]$, so $f(x)-f(a)=0$ for every $x\in[a,b]$, and therefore $$f'_+(a)=\lim_{x\to a^+} \frac{f(x)-f(a)}{x-a} = \lim_{x\to a^+}\frac 0{x-a} = 0.$$