As far as I know, to check the monotonicity of a given function f(x) at a point x = a, we check which of the following inequalities (or equalities) are satisfied for all sufficiently small h > 0:
- f(a + h) > f(a) > f(a - h)
- f(a + h) < f(a) < f(a - h)
- f(a + h) > f(a) = f(a - h)
- f(a + h) = f(a) > f(a - h)
- f(a + h) < f(a) = f(a - h)
- f(a + h) = f(a) < f(a - h)
- f(a + h) > f(a) < f(a - h)
- f(a + h) < f(a) > f(a - h)
- f(a + h) = f(a) = f(a - h)
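For what it's worth, the case analysis above can be sketched numerically. This is a rough Python sketch (`local_behavior` is my own hypothetical helper, and it only samples one small h rather than all sufficiently small h, so it can be fooled by functions that oscillate below that scale):

```python
def local_behavior(f, a, h=1e-6):
    """Classify f at x = a by comparing f(a - h), f(a), f(a + h).

    A crude numerical sketch of the pointwise definition: we sample a
    single small h, so rapidly oscillating functions can fool it.
    """
    lo, mid, hi = f(a - h), f(a), f(a + h)
    if lo < mid < hi:
        return "increasing"
    if lo > mid > hi:
        return "decreasing"
    if lo > mid < hi:
        return "local minimum"
    if lo < mid > hi:
        return "local maximum"
    return "inconclusive"

print(local_behavior(lambda x: x**3, 0.0))   # increasing
print(local_behavior(lambda x: x**2, 0.0))   # local minimum
```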
My confusion lies here. Does the value of the function's derivative at x=a have anything to do with its monotonicity at x=a?
For example, let's discuss the monotonicity of y = x² at x = 0. Using the definitions stated above, it is easy to observe that f(0 + h) > f(0) < f(0 - h), so f is not monotone at x = 0 (it has a local minimum there), yet f'(0) = 0.
Could someone please help clear up the confusion: what is the actual definition of monotonicity at a point, and why don't the two methods above agree with each other?
Thanks a lot!
The derivative does tell you the monotonicity when it is non-zero. When it is zero, x = a is a critical point, and the derivative test alone is inconclusive: the function may or may not be monotone there.
In your example, you're totally correct: it's inconclusive, since for $f(x) = x^2$ we have $f'(0) = 0$ and $f$ is not monotone at $0$, whereas $g(x) = x^3$ also has $g'(0) = 0$ but is monotone there.
So the idea is that for an $f(x)$ whose derivative exists and is continuous on an interval containing $a$, $f'(a) \neq 0 \implies f(x)$ is monotone in a neighbourhood of $x = a$. (Continuity of $f'$ at $a$ is the key part of the condition: differentiability alone is not quite enough.)
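A well-known counterexample shows why $f'(a) \neq 0$ with differentiability alone is not sufficient: $g(x) = x/2 + x^2\sin(1/x)$ (with $g(0) = 0$) has $g'(0) = 1/2 > 0$, yet $g'(x) = 1/2 + 2x\sin(1/x) - \cos(1/x)$ equals $-1/2$ at every point $x_n = 1/(2\pi n)$, so $g$ is not monotone on any interval around $0$. A numerical Python sketch (difference quotients only, not a proof):

```python
import math

def g(x):
    # Classic counterexample: differentiable everywhere with
    # g'(0) = 1/2 > 0, but g' is discontinuous at 0, and g is not
    # monotone on any interval around 0.
    return x / 2 + x * x * math.sin(1 / x) if x != 0 else 0.0

def num_deriv(f, x, h=1e-9):
    # Symmetric difference quotient (numerical sketch, not a proof).
    return (f(x + h) - f(x - h)) / (2 * h)

print(num_deriv(g, 0.0))         # close to g'(0) = 1/2 > 0
for n in (10, 100, 1000):
    x_n = 1 / (2 * math.pi * n)  # exact g'(x_n) = -1/2 < 0
    print(num_deriv(g, x_n))
```

So arbitrarily close to $0$ there are points where $g$ is strictly decreasing, even though $g'(0) > 0$; requiring $f'$ to be continuous at $a$ rules this out.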