Conditions for Convergence of Newton's method


My professor said that if we have a twice continuously differentiable real function $f$ on an interval $[a,b]$ such that:

  1. $f(a)f(b)<0$
  2. $f'$ and $f''$ do not change sign on $[a,b]$, and $f'$ does not vanish

then, if $f(a)f''(a)>0$ (resp. $f(b)f''(b)>0$), Newton's method started at $x_0=a$ (resp. $x_0=b$) converges.

I suspect this is false, but I can't come up with a counterexample.

Could anyone find one?

This is correct: it is Darboux's theorem (have a look here for the original paper).

There is another reference here (it is in French, but easy to read).

It is very important for numerical analysis and computing.
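As a concrete illustration of the condition, here is a minimal Python sketch on an example of my own choosing: $f(x)=x^2-2$ on $[1,2]$ satisfies both hypotheses ($f(1)f(2)<0$, $f''\equiv 2>0$, $f'>0$), and since $f(2)f''(2)>0$ the theorem says to start at $x_0=b=2$.

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Plain Newton iteration: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example (my own, not from the question): f(x) = x^2 - 2 on [1, 2].
# f(1)f(2) < 0, f'' = 2 > 0 everywhere, f' = 2x > 0 on [1, 2],
# and f(2)f''(2) > 0, so the theorem prescribes x0 = b = 2.
f = lambda x: x * x - 2
fp = lambda x: 2 * x
root = newton(f, fp, 2.0)
print(root)  # converges to sqrt(2)
```

Starting from the endpoint where $f$ and $f''$ share a sign keeps every iterate on the same side of the root, which is why the sequence converges monotonically.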