Real Analysis. Differentiation. Check idea.


Let $f: I \longrightarrow \mathbb{R}$ be continuous, with $I$ an interval. If, for all $x \in I$, the right derivative $f'_{+}(x)$ exists and $f'_{+}(x) > 0$, then $f$ is increasing.

$\textbf{Solution:}$ Let $a < b$ in $I$. Since $f'_{+}(a) > 0$, we can take $\delta > 0$ such that $x \in (a, a + \delta) \Longrightarrow f(a) < f(x)$. If $b \in (a, a + \delta)$, the result follows. If not, take $c \in (a, a + \delta)$ and repeat the same process starting from $c$; continuing in this way (using the continuity of $f$) we reach some $k$ with $c < k < b$ and $b \in (k, k + \delta_{k})$, so that $f(a) < f(c) < \dots < f(k) < f(b)$.
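The choice of $\delta$ in the first step can be justified directly from the definition of the right derivative; a short sketch (this spells out a step the solution takes for granted):

```latex
Since
\[
  f'_{+}(a) = \lim_{x \to a^{+}} \frac{f(x) - f(a)}{x - a} > 0,
\]
there exists $\delta > 0$ such that
\[
  0 < x - a < \delta
  \;\Longrightarrow\;
  \frac{f(x) - f(a)}{x - a} > 0
  \;\Longrightarrow\;
  f(x) > f(a).
\]
```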

Is the idea correct?

$\textbf{Best Answer:}$

Suppose that $f$ is not increasing. Then there exist $x_1, x_2 \in I$ with $x_1 < x_2$ and $f(x_1) \geq f(x_2)$.

Since $f$ is continuous, it attains its maximum on $[x_1, x_2]$, and because $f(x_1) \geq f(x_2)$ this maximum is attained at some $x_0 \in [x_1, x_2)$ (if it were attained only at $x_2$, then $f(x_2) \geq f(x_1) \geq f(x_2)$ would force $f(x_1) = f(x_2)$, so $x_1$ would attain it as well). Then $f(x_0) \geq f(x)$ for all $x \in (x_0, x_2)$, hence $$\lim_{x \to x_0^{+}} \frac{f(x) - f(x_0)}{x - x_0} \leq 0,$$ which contradicts the initial assumption that $f'_{+}(x_0) > 0$.
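The final limit inequality follows from the sign of each difference quotient; a sketch of that step:

```latex
For every $x \in (x_0, x_2)$ we have $f(x) \leq f(x_0)$ and $x - x_0 > 0$, so
\[
  \frac{f(x) - f(x_0)}{x - x_0} \leq 0.
\]
Since one-sided limits preserve non-strict inequalities,
\[
  f'_{+}(x_0) = \lim_{x \to x_0^{+}} \frac{f(x) - f(x_0)}{x - x_0} \leq 0,
\]
contradicting $f'_{+}(x_0) > 0$.
```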