I would like some input on my proof for this question. I solved it using two different approaches and would like to know where I can improve. I'm especially interested in a proof that does not use the fact that a strictly increasing function has $f'(x) \gt 0$ for all $x \in I$.
Let $a \lt b$, $[a,b] \subset \mathbb{R}$, and let $f:[a,b] \to \mathbb{R}$ be continuous on $[a,b]$. Suppose that no interior point is a local extremum. Show that $f$ is strictly increasing or strictly decreasing.
$\textbf{Answer 1}$: Suppose by contradiction that $f$ is not strictly increasing. Let $x,y,z \in [a,b]$ and assume that $x \lt y \lt z$. Therefore, $f(x) \lt f(y) \gt f(z)$. Consider the closed interval $[x,z]$. Since $f(y)$ is greater than both $f(x)$ and $f(z)$, neither $f(x)$ nor $f(z)$ can be the maximum value of $f$ over $[x,z]$. However, since the interval is compact, $f$ attains its maximum over this interval. Therefore, $\exists t \in [x,z]: f(t) \ge f(w), \forall w \in [x,z]$. Since $t$ is an interior point of $[x,z]$, it is an interior point of $[a,b]$, hence it is a local maximum of $f$ in $[a,b]$, which is a contradiction. The argument for the case where $f$ is not strictly decreasing is analogous.
$\textbf{Answer 2}$: Since $\nexists c \in (a,b): f'(c)=0$, either $f'(c) \gt 0$ for all $c \in (a,b)$ or $f'(c) \lt 0$ for all $c \in (a,b)$. Without loss of generality, assume that $f'(c) \gt 0, \forall c \in (a,b)$. Consider the closed interval $[x_{1},x_{2}] \subseteq (a,b)$, with $x_{1} \lt x_{2}$. By the Mean Value Theorem, $\exists d \in (x_{1},x_{2}): f'(d) = \frac{f(x_{2})-f(x_{1})}{x_{2}-x_{1}}$. Since $f'(d) \gt 0$, it follows that $f(x_{2}) \gt f(x_{1})$; therefore, $f$ is strictly increasing.
Your second answer assumes a hypothesis that isn't part of the problem, namely that $f'$ exists on $(a,b)$, and so it is technically incorrect. (Even granting differentiability, the step from "$f'$ never vanishes" to "$f'$ has constant sign on $(a,b)$" needs justification, e.g. Darboux's theorem, since $f'$ need not be continuous.)
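To make the gap concrete, here is an example of my own (not from the problem statement) of a function satisfying the problem's hypotheses that fails to be differentiable at an interior point:

```latex
% f(x) = x^{1/3} on [-1,1] is continuous and strictly increasing,
% so it has no interior local extremum, yet f'(0) does not exist:
% the difference quotient at 0 is unbounded.
f(x) = \sqrt[3]{x}, \qquad x \in [-1,1],
\qquad \frac{f(h)-f(0)}{h} = h^{-2/3} \longrightarrow +\infty
\quad (h \to 0).
```

So any argument that starts from properties of $f'$ cannot cover all functions allowed by the hypotheses.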
Your initial argument in the first answer is incorrect: if $f$ fails to be strictly increasing, there is no reason that it must ever decrease. The negation only gives points $x \lt y$ with $f(x) \ge f(y)$; it does not produce three points with $f(x) \lt f(y) \gt f(z)$.
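As an illustration of that logical point (my own example; note it does not satisfy the problem's hypotheses, since every point of $(1,2)$ is a weak local extremum, but it shows the negation of "strictly increasing" alone forces no decrease):

```latex
% f is continuous on [0,2] and not strictly increasing
% (it is constant on [1,2]), yet it never decreases, so no
% triple x < y < z with f(x) < f(y) > f(z) exists.
f(x) = \min(x, 1) =
\begin{cases}
x, & 0 \le x \le 1,\\
1, & 1 \lt x \le 2.
\end{cases}
```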
The first thing to note is that $f$ is one-to-one. If $x,y \in [a,b]$ and $x < y$, then the extreme value theorem guarantees that $f$ attains both its minimum and maximum values on $[x,y]$, and your hypothesis implies both are attained at the endpoints. If it were the case that $f(x) = f(y)$, the common value would be both the maximum and the minimum of $f$ on $[x,y]$, telling you that $f$ is constant on $[x,y]$. This is a contradiction, since then $f$ would have an extreme value at every $z \in (x,y)$. Thus $f(x) \not= f(y)$.
It is a very well-known fact (and a simple corollary of the intermediate value theorem) that if $f$ is continuous and one-to-one on an interval $[a,b]$ then $f$ is strictly monotone on $[a,b]$, which is what you need to show.
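For completeness, here is how I would sketch that corollary via the intermediate value theorem (the standard two-preimage argument; details are mine, not part of the original exchange):

```latex
% Claim: if f is continuous and one-to-one on [a,b], then f is strictly monotone.
% Since f is one-to-one, f(a) \ne f(b); assume WLOG f(a) < f(b).
% Suppose f is not strictly increasing: there are s < t in [a,b] with f(s) > f(t).
%
% Case 1: f(s) > f(b). Pick v with
%   \max\{f(a), f(b)\} < v < f(s).
% By the IVT, v is attained in (a,s) (since f(a) < v < f(s))
% and again in (s,b) (since f(s) > v > f(b)), contradicting injectivity.
%
% Case 2: f(s) < f(b). Pick v with
%   f(t) < v < \min\{f(s), f(b)\}.
% By the IVT, v is attained in (s,t) (since f(s) > v > f(t))
% and again in (t,b) (since f(t) < v < f(b)), contradicting injectivity.
%
% Hence f is strictly increasing; in the case f(a) > f(b), the same
% argument applied to -f shows f is strictly decreasing.
```

Combined with the injectivity argument above, this completes the proof without ever mentioning $f'$.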