Given $f'(a) > 0$, prove that $f$ is increasing on $(a - \delta, a + \delta)$.


Let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a function, and suppose that $f'(a) > 0$ at an interior point $a$ of its domain. Prove that there exists a $\delta > 0$ such that $f$ is increasing on the open interval $(a - \delta, a + \delta)$.

My attempt: Suppose $f'(a) = L > 0$. This means that $$ \lim_{h \to 0} \frac{ f(a + h) - f(a)}{h} > 0. $$ By definition, then, for all $\epsilon > 0$ there exists a $\delta > 0$ such that $$ 0 < | h | < \delta \Rightarrow \bigg| \frac{ f(a + h) - f(a) }{h} - L \bigg| < \epsilon. $$ If we choose $\epsilon = L > 0$, then this means that $$ \frac{f(a +h) - f(a) }{h} > 0 \qquad (*) $$ whenever $ 0 < |h| < \delta$. I now need to prove $f$ is increasing on $(a - \delta, a + \delta)$. Let $x,y \in (a - \delta, a + \delta)$, and suppose that $x \leq y$. Then we need to prove that $f(x) \leq f(y)$. I want to somehow use (*) for this, but I'm not sure how.
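For the record, the step from $\epsilon = L$ to $(*)$ can be spelled out as a two-sided bound:

```latex
\Big| \frac{f(a+h)-f(a)}{h} - L \Big| < L
\;\Longleftrightarrow\;
0 < \frac{f(a+h)-f(a)}{h} < 2L,
```

so in particular the difference quotient is positive whenever $0 < |h| < \delta$, which is exactly $(*)$.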


There are 2 best solutions below

Answer 1:

Suppose that $f$ is $C^1$ (continuously differentiable). If $f'(a)>0$, then by continuity of $f'$ there exists $c>0$ such that $f'(x)>0$ for all $x\in [a-c,a+c]$. Then for $u<v$ with $u,v\in [a-c,a+c]$, the mean value theorem gives $f(v)-f(u)=f'(w)(v-u)>0$ for some $w\in (u,v)$.
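A numerical illustration (not a proof) of this mean-value-theorem argument, using the sample function $f(x) = \sin(x) + 2x$, for which $f'(x) = \cos(x) + 2 > 0$ everywhere (the function and interval are chosen here for illustration; they are not from the answer):

```python
import math

def f(x):
    # C^1 function with f'(x) = cos(x) + 2 > 0 for all x,
    # so the MVT argument predicts f is increasing on [a - c, a + c].
    return math.sin(x) + 2 * x

def is_increasing_on(g, lo, hi, n=1000):
    """Check g(x_i) < g(x_{i+1}) on n+1 equally spaced sample points."""
    xs = [lo + (hi - lo) * i / n for i in range(n + 1)]
    return all(g(u) < g(v) for u, v in zip(xs, xs[1:]))

a, c = 0.0, 0.5
print(is_increasing_on(f, a - c, a + c))  # → True
```

Sampling, of course, only corroborates the conclusion; the actual proof is the MVT step above.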

In general, if $f$ is differentiable but not $C^1$, the function need not be increasing on any neighbourhood of $a$, as pointed out in the comments.
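For completeness, the classic oscillating-derivative example of this failure (not stated in the answer above) is

```latex
f(x) = \begin{cases} \dfrac{x}{2} + x^2 \sin\dfrac{1}{x}, & x \neq 0,\\[1mm] 0, & x = 0, \end{cases}
\qquad
f'(0) = \lim_{h \to 0} \Big( \tfrac12 + h \sin\tfrac1h \Big) = \tfrac12 > 0,
```

while for $x \neq 0$ one has $f'(x) = \tfrac12 + 2x\sin\tfrac1x - \cos\tfrac1x$, so at $x_k = \tfrac{1}{2k\pi}$ the derivative is $f'(x_k) = \tfrac12 + 0 - 1 = -\tfrac12 < 0$. Since such $x_k$ occur arbitrarily close to $0$, $f$ is not increasing on any interval $(-\delta, \delta)$.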

Answer 2:

Usually one defines $f$ to be increasing on a non-empty interval $J$ if for all $x,y\in J$ it holds that $x< y\Rightarrow f(x)\le f(y)$. But you cannot deduce that solely from $f'(a)>0$. Take as example $$f(x):=\begin{cases} \tan(x),&\text{if $x$ is rational;}\\ x,&\text{else.} \end{cases}$$ where $a=0$.

What can be shown instead is $$x<0<y\Rightarrow f(x)<f(y),$$ which compares points on opposite sides of $a=0$ through the value $f(0)$; it is a comparison across the single point $a$, not monotonicity on an interval.
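This displayed comparison is exactly where $(*)$ from the question can be used (here with $a = 0$), splitting by the sign of $h$:

```latex
-\delta < x < 0:\quad f(x) - f(0) = x \cdot \frac{f(x) - f(0)}{x} < 0
\quad\text{(negative times positive)},
```
```latex
0 < y < \delta:\quad f(y) - f(0) = y \cdot \frac{f(y) - f(0)}{y} > 0
\quad\text{(positive times positive)},
```

hence $f(x) < f(0) < f(y)$. Note that $(*)$ says nothing about two points on the same side of $0$, which is why monotonicity does not follow.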