Problem in undergraduate analysis course


Say $f$ is defined on a neighborhood of $x = a$ on which $f'(a) > 0$. Prove that $\exists\delta > 0$ such that $f(a) < f(x_1)$ for some $x_1 \in (a,a+\delta)$, and $f(x_2) < f(a)$ for some $x_2 \in (a-\delta,a)$.

The only sort of hint the book has given is to use $\epsilon - \delta$ to define $f'(a) = \lim_{h\rightarrow 0} \frac{f(a+h) - f(a)}{h}$ and let $\epsilon = \frac{1}{2}f'(a)$. I'm not exactly sure how to interpret this. Any help, even if it's a nudge in the right direction, would be greatly appreciated.


There are 2 best solutions below


Use the epsilon-delta definition of the limit. For $h \neq 0$ sufficiently small, the difference quotient $\frac{f(a+h)-f(a)}{h}$ is within $\epsilon$ of $f'(a)$; choosing $\epsilon = \frac{1}{2}f'(a) > 0$ forces the quotient to stay above $\frac{1}{2}f'(a) > 0$. Hence $f(a+h) - f(a)$ has the same sign as $h$, and in fact $|f(a+h) - f(a)| > \frac{1}{2}f'(a)\,|h|$.
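The hint can be carried out explicitly; here is a sketch of the estimate with $\epsilon = \tfrac{1}{2}f'(a)$:

```latex
Since $f'(a) = \lim_{h\to 0}\frac{f(a+h)-f(a)}{h}$, taking
$\epsilon = \tfrac{1}{2}f'(a) > 0$ yields a $\delta > 0$ such that
for all $0 < |h| < \delta$,
\[
  \left|\frac{f(a+h)-f(a)}{h} - f'(a)\right| < \tfrac{1}{2}f'(a),
  \qquad\text{hence}\qquad
  \frac{f(a+h)-f(a)}{h} > \tfrac{1}{2}f'(a) > 0.
\]
If $0 < h < \delta$, multiplying by $h > 0$ gives
$f(a+h) - f(a) > \tfrac{1}{2}f'(a)\,h > 0$, so $x_1 = a+h$ works.
If $-\delta < h < 0$, multiplying by $h < 0$ reverses the inequality:
$f(a+h) - f(a) < \tfrac{1}{2}f'(a)\,h < 0$, so $x_2 = a+h$ works.
```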


$f(a+h) \approx f(a) + h\cdot f'(a)$ for $a+h$ in that neighbourhood. Since $f'(a) > 0$: if $h>0$ then $f(a) < f(a+h)$, and if $h<0$ then $f(a+h) < f(a)$. (To make this rigorous, replace the approximation with the $\epsilon$-$\delta$ estimate from the hint.)