I'm reviewing my real analysis and I came across this statement:
Let $f:\mathbb{R}\to \mathbb{R}$ be a differentiable function, and let $c\in \mathbb{R}$. Then if $x\in B(c,\epsilon/f'(c))$, we have $f(x)\in B(f(c),\epsilon)$.
So it says that if $f$ is a differentiable function, then in the $\epsilon$-$\delta$ argument for continuity, for any given $\epsilon>0$, we may choose $\delta=\epsilon/f'(c)$ to satisfy the condition that $|x-c|<\delta$ implies $|f(x)-f(c)|<\epsilon$.
I tried to prove this statement but I could not prove it in full generality. I would appreciate any help and comments on how to prove this statement.
Here is what I have tried so far: By Taylor's theorem, one has $$f(x)=f(c)+f'(c)(x-c)+O((x-c)^2).$$ Therefore, if $|x-c|<\epsilon/f'(c)$, then $|f'(c)(x-c)|<\epsilon$, so by the triangle inequality $|f(x)-f(c)|<\epsilon+|O((x-c)^2)|$. The second term, being of higher order, should be negligible, but I could not eliminate it completely.
The statement, as currently written, is not true. (Note also that the radius $\epsilon/f'(c)$ is not even a well-defined ball radius unless $f'(c)>0$.)
Consider $f(x) = x^2 + x$ and $c = 0$, so that $f'(c) = 2c+1 = 1$ and the prescribed radius is $\epsilon/f'(c) = \epsilon$. Then the statement would imply that $|x| < \epsilon$ forces $|f(x)| < \epsilon$, i.e. that $f$ has at most linear growth, which is impossible for a quadratic. For a precise counterexample, take $\epsilon = 2$ and $x = 1$: then $|x - c| = 1 < 2 = \epsilon/f'(c)$, yet $|f(x) - f(c)| = |f(1)| = 2 \not< 2 = \epsilon$.
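As a quick sanity check, here is a minimal Python sketch of that counterexample (the names `f`, `c`, `eps`, `delta` are ad hoc, not from any library):

```python
# Check the counterexample f(x) = x^2 + x at c = 0, where f'(0) = 1,
# so the claimed radius is delta = eps / f'(c) = eps.

def f(x):
    return x**2 + x

c = 0.0
fprime_c = 2 * c + 1   # f'(x) = 2x + 1
eps = 2.0
delta = eps / fprime_c  # the radius the (false) statement prescribes

x = 1.0
assert abs(x - c) < delta          # x lies inside B(c, delta): 1 < 2
print(abs(f(x) - f(c)))            # 2.0
print(abs(f(x) - f(c)) < eps)      # False: the image is NOT inside B(f(c), eps)
```

Running it shows a point inside the prescribed ball whose image escapes $B(f(c),\epsilon)$, confirming the claim fails.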