Suppose $f$ is defined on a neighborhood of $x = a$ and $f'(a) > 0$. Prove that $\exists\delta > 0$ such that $f(a) < f(x_1)$ for some $x_1 \in (a,a+\delta)$, and $f(x_2) < f(a)$ for some $x_2 \in (a-\delta,a)$.
The only sort of hint the book has given is to use $\epsilon - \delta$ to define $f'(a) = \lim_{h\rightarrow 0} \frac{f(a+h) - f(a)}{h}$ and let $\epsilon = \frac{1}{2}f'(a)$. I'm not exactly sure how to interpret this. Any help, even if it's a nudge in the right direction, would be greatly appreciated.
Use the epsilon-delta definition of the limit. For $|h|$ sufficiently small, the difference quotient $\frac{f(a+h)-f(a)}{h}$ lies within $\epsilon$ of $f'(a)$; with the hint's choice $\epsilon = \frac{1}{2}f'(a) > 0$, the quotient is forced to stay positive. That means $f(a+h) - f(a)$ has the same sign as $h$, which is exactly what you need on each side of $a$.
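Spelling the hint out a bit (this is the standard argument, using the notation from the question):

With $\epsilon = \frac{1}{2}f'(a) > 0$, the definition of $f'(a)$ gives a $\delta > 0$ such that for all $0 < |h| < \delta$,
$$\left|\frac{f(a+h)-f(a)}{h} - f'(a)\right| < \frac{1}{2}f'(a),$$
and hence
$$\frac{f(a+h)-f(a)}{h} > f'(a) - \frac{1}{2}f'(a) = \frac{1}{2}f'(a) > 0.$$
For $0 < h < \delta$, multiplying by $h > 0$ gives $f(a+h) - f(a) > \frac{1}{2}f'(a)\,h > 0$, so any $x_1 = a + h \in (a, a+\delta)$ works. For $-\delta < h < 0$, multiplying by $h < 0$ reverses the inequality: $f(a+h) - f(a) < \frac{1}{2}f'(a)\,h < 0$, so any $x_2 = a + h \in (a-\delta, a)$ works.

Note this argument actually proves the stronger statement that the conclusion holds for *every* point in the respective intervals, not just for some point.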