Newton's method linear convergence proof


How would you show that if $a$ is a root of $f$ with $f'(a)=0$, then Newton's method is only linearly convergent when
1. $f''(a)\neq 0$
2. $f''(a)=0, f'''(a) \neq 0$?
I am having some trouble getting to the point where I can take the limit of the ratio of the error terms and use L'Hôpital's rule to prove this.
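
For concreteness, the limit in question can be set up as follows (a sketch, assuming $f$ has enough derivatives near $a$): with $x_+=x-\frac{f(x)}{f'(x)}$,
$$ \frac{x_+-a}{x-a}=1-\frac{f(x)}{(x-a)f'(x)}, $$
and if $f(a)=f'(a)=0$, $f''(a)\ne0$, two applications of L'Hôpital's rule give
$$ \lim_{x\to a}\frac{f(x)}{(x-a)f'(x)}=\lim_{x\to a}\frac{f'(x)}{f'(x)+(x-a)f''(x)}=\lim_{x\to a}\frac{f''(x)}{2f''(x)+(x-a)f'''(x)}=\frac12, $$
so the error ratio tends to $\tfrac12$. In case 2 one further application gives the limit $\tfrac13$ and hence the error ratio $\tfrac23$.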

Best Answer

Write $f(x)=(x-a)^2g(x)$ or $f(x)=(x-a)^3g(x)$ respectively, both with $g(a)\ne0$, and compute the influence of these factorizations on the Newton iteration.


If $f(x)=(x-a)^mg(x)$ with $g(a)\ne 0$, then $f'(x)=m(x-a)^{m-1}g(x)+(x-a)^mg'(x)$, and the Newton step becomes
$$ x_+=x-\frac{f(x)}{f'(x)}=x-\frac{(x-a)g(x)}{mg(x)+(x-a)g'(x)}=a+(x-a)\frac{(m-1)g(x)+(x-a)g'(x)}{mg(x)+(x-a)g'(x)}. $$

For $x$ close to $a$ the term $(x-a)g'(x)$ becomes very small relative to $g(x)$, so the Newton iteration behaves like
$$ x_+\approx a+(1-\tfrac1m)(x-a), $$
i.e. the error is contracted by the factor $\tfrac{m-1}m$ in each step: $\tfrac12$ for the double root in case 1, $\tfrac23$ for the triple root in case 2, which is exactly linear convergence.

For a rigorous proof you have to quantify "becomes very small relative to" and turn it into an inequality $|x_+-a|\leqslant C|x-a|$ with a constant $C<1$ on a neighborhood of $a$, but the general idea is as above.
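
A quick numerical sanity check of the factor $\tfrac{m-1}m$, as a small Python sketch (the test functions $(x-1)^2(x+2)$ and $(x-1)^3(x+2)$ are arbitrary choices with a double and a triple root at $a=1$, not taken from the question): the printed error ratios settle near $\tfrac12$ and $\tfrac23$ respectively.

```python
def error_ratios(f, fprime, x0, a, steps=12):
    """Run Newton's method from x0 and return the ratios |x_{k+1}-a| / |x_k-a|."""
    x, ratios = x0, []
    prev_err = abs(x - a)
    for _ in range(steps):
        x = x - f(x) / fprime(x)          # one Newton step
        err = abs(x - a)
        if prev_err == 0.0 or err == 0.0:  # stop if an iterate hits the root exactly
            break
        ratios.append(err / prev_err)
        prev_err = err
    return ratios

# Double root at a = 1 (m = 2): ratios should approach 1/2.
print(error_ratios(lambda x: (x - 1)**2 * (x + 2),
                   lambda x: 2*(x - 1)*(x + 2) + (x - 1)**2,
                   x0=1.5, a=1.0))

# Triple root at a = 1 (m = 3): ratios should approach 2/3.
print(error_ratios(lambda x: (x - 1)**3 * (x + 2),
                   lambda x: 3*(x - 1)**2 * (x + 2) + (x - 1)**3,
                   x0=1.5, a=1.0))
```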