Suppose I have some continuously differentiable function $f(a)$ such that $f'(a)>0$ if and only if $a<c$ ($c$ is some positive constant). Is it true that for all $b>0$, $$a+b<c \implies f(a+b)>f(a)?$$
My lecturer questioned this assumption, but it seems intuitively true: an epsilon increase in $a$ increases $f(a)$ whenever $a<c$, and a series of such epsilon increases should do the same, provided the argument stays below $c$. Hence a discrete increase $b$ in the argument should also increase $f(a)$. Is this true, and if so, why?
By the Mean Value Theorem, there is a point $x_0\in (a,a+b)$ such that $$f'(x_0)=\frac{f(a+b)-f(a)}{b}.$$ Since $x_0<a+b<c$, we have $$f'(x_0)>0,$$ and because $b>0$ this gives $$f(a+b)-f(a)=b\,f'(x_0)>0,$$ and we are done.
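As a quick numerical sanity check of the conclusion, here is a sketch using one hypothetical function satisfying the hypothesis: $f(a) = -(a-c)^2$ with $c = 1$, for which $f'(a) = -2(a-c) > 0$ exactly when $a < c$. The choice of $f$, $c$, and the sampled grid are all illustrative assumptions, not part of the original question.

```python
# Hypothetical example: f(a) = -(a - c)^2 with c = 1, so that
# f'(a) = -2(a - c) is positive precisely when a < c.
c = 1.0

def f(a):
    return -(a - c) ** 2

# For every sampled a and b > 0 with a + b < c, the MVT argument
# predicts f(a + b) > f(a). Verify this on a coarse grid.
for a in [x / 10 for x in range(-20, 10)]:
    for b in [0.01, 0.1, 0.5]:
        if a + b < c:
            assert f(a + b) > f(a), (a, b)

print("f(a+b) > f(a) held for every sampled (a, b) with a+b < c")
```

Of course a finite grid proves nothing by itself; the Mean Value Theorem argument above is what establishes the claim in general.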