My professor said that if we have a twice continuously differentiable real function $f$ in an interval $[a,b]$ such that:
- $f(a)f(b)<0$
- $f'$ and $f''$ don't change signs and $f'$ does not vanish
then, if $f(a)f''(a)>0$ (respectively $f(b)f''(b)>0$), Newton's method started at $x_0=a$ (respectively $x_0=b$) converges to the root.
I suspect this is false, but I can't come up with a counterexample.
Could anyone find one?
This is correct: it is Darboux's theorem (have a look here for the original paper).
There is another write-up here (it is in French, but it is easy to read).
It is very important in numerical analysis and computing.
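As a quick numerical illustration (not part of the original answer), here is a minimal sketch of the theorem's hypotheses on the example $f(x)=x^3-2$ on $[1,2]$, where $f'>0$ and $f''>0$ throughout, so the theorem says to start at the endpoint $b=2$ where $f(b)f''(b)>0$:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=100):
    """Plain Newton iteration x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# f(x) = x^3 - 2 on [1, 2]: f(1) = -1 < 0, f(2) = 6 > 0,
# f'(x) = 3x^2 > 0 and f''(x) = 6x > 0 on the interval,
# so f(b) f''(b) > 0 at b = 2, and the theorem predicts
# convergence (in fact monotone) starting from x0 = 2.
root = newton(lambda x: x**3 - 2, lambda x: 3 * x**2, x0=2.0)
print(root)  # approximately 2 ** (1/3)
```

Starting instead at $x_0 = a = 1$ (where $f(a)f''(a) < 0$) Newton can first overshoot the interval, which is why the theorem singles out the endpoint sharing the sign of $f''$.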