This is a proof given for Rolle's Theorem: Let $f$ be continuous on $[a, b]$, $a<b$, and differentiable on $(a, b)$. Suppose $f(a)=f(b)$. Then there exists $c$ such that $c \in(a, b)$ and $f^{\prime}(c)=0$.
Proof: If $f$ is constant on $[a, b]$, then $f^{\prime}(c)=0$ for all $c \in(a, b)$. Otherwise, suppose there exists $x \in(a, b)$ such that $f(x)>f(a)$. (A similar argument can be given if $f(x)<f(a)$.) Then there exists $c \in(a, b)$ such that $f(c)$ is a maximum. Hence we have $f^{\prime}(c)=0$.
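For reference, the statement above can be written formally. Here is a minimal Lean 4 sketch of the theorem as stated (assuming Mathlib's `ContinuousOn`, `Set.Icc`, `Set.Ioo`, and `deriv`; the theorem name below is hypothetical, while Mathlib's own version of Rolle's theorem is, I believe, `exists_deriv_eq_zero`):

```lean
import Mathlib

-- Hypothetical restatement of Rolle's Theorem as quoted above;
-- the proof is left as a placeholder (`sorry`), since the point
-- here is only to pin down the statement being discussed.
theorem rolle_sketch (f : ℝ → ℝ) (a b : ℝ) (hab : a < b)
    (hcont : ContinuousOn f (Set.Icc a b))
    (hdiff : ∀ x ∈ Set.Ioo a b, DifferentiableAt ℝ f x)
    (heq : f a = f b) :
    ∃ c ∈ Set.Ioo a b, deriv f c = 0 := by
  sorry
```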
By what reasoning can we say that if $f(x)>f(a)$ at some point, then there is surely a $c$ in $(a,b)$ at which $f$ attains its maximum? Is it because $f$ is continuous and bounded, and hence must have a local maximum/minimum? The problem does not state that $f$ is bounded, but if it is not, then wouldn't continuity fail on some interval?