Newton's method: convergence


Suppose $f$ is differentiable on the interval $[a,b]$ and $x_0 \in [a,b]$. Given the Newton's method sequence $x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}$, with $x_n \in [a,b]$ for all $n$, show that if the sequence $x_n \rightarrow r$, then $f(r)=0$.

Are the given hypotheses really sufficient to prove this statement? What I could do so far is the following:

$\lim x_{n+1}=r=\lim \left [x_n-\frac{f(x_n)}{f'(x_n)} \right ]$

But I don't know how to proceed, since $f'$ is not assumed to be continuous.
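
One way to make the difficulty concrete is to rearrange the iteration itself. A sketch, under the extra assumption that $f'$ is bounded on $[a,b]$ (which the stated hypotheses do not guarantee, since a derivative need not be bounded even on a closed interval):

```latex
% Rearranging the Newton iteration:
f(x_n) = f'(x_n)\,(x_n - x_{n+1}).
% If |f'(x)| \le M on [a,b], then
|f(x_n)| \le M\,|x_n - x_{n+1}| \longrightarrow 0,
% and since f is continuous (being differentiable),
f(r) = \lim_{n\to\infty} f(x_n) = 0.
```

Without a bound like $M$ on $f'$, the middle estimate fails, which is exactly where the argument gets stuck when $f'$ is neither continuous nor bounded.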