Newton's Method Convergence Rate when first derivative is non-zero and second derivative is zero


Using the Taylor expansion it is easy to show that the convergence of Newton's method for a root $\alpha$ is quadratic when $f'(\alpha)\neq0$ and $f''(\alpha)\neq0$. If instead $f'(\alpha)=0$ and $f''(\alpha)\neq0$ we have linear convergence.
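A quick numerical sanity check of these two rates (the example functions here are chosen for illustration and are not part of the question): at a simple root the ratio $e_{k+1}/e_k^2$ stays bounded, while at a double root the ratio $e_{k+1}/e_k$ tends to a constant.

```python
import math

def newton_errors(f, fp, x0, alpha, steps):
    """Run Newton's method and record the absolute errors |x_k - alpha|."""
    x, errs = x0, []
    for _ in range(steps):
        x = x - f(x) / fp(x)
        errs.append(abs(x - alpha))
    return errs

# Simple root: f(x) = x^2 - 2, alpha = sqrt(2), f'(alpha) != 0
# -> e_{k+1} / e_k^2 stays bounded (quadratic convergence).
simple = newton_errors(lambda x: x * x - 2, lambda x: 2 * x,
                       1.5, math.sqrt(2), 4)

# Double root: f(x) = (x - 1)^2, alpha = 1, f'(alpha) = 0
# -> e_{k+1} / e_k tends to 1/2 (linear convergence).
double = newton_errors(lambda x: (x - 1) ** 2, lambda x: 2 * (x - 1),
                       1.5, 1.0, 4)
```

For the double root the error is exactly halved at every step, while for the simple root the number of correct digits roughly doubles per iteration.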

What is the case however if $f'(\alpha)\neq0$ and $f''(\alpha)=0$?

Using Taylor's Theorem with $f(x_k)=f'(\alpha)(x_k-\alpha)+\frac{1}{2}f''(\xi_k)(x_k-\alpha)^2$, I have found that the limit of $\Big\lvert\frac{x_{k+1}-\alpha}{x_k-\alpha}\Big\rvert$ is $0$, which shows the convergence is faster than linear but does not pin down the rate. Surely we need more information about the value of $f'''(\alpha)$ to deduce more about the convergence of the method? Is it possible that the method does not converge in this case?

Any hints would be great!


Regardless of any further assumptions, the first step of the local analysis of Newton's method for $f$ at least $C^2$, writing $r_n = x_n - \alpha$ for the error, goes like

$$x_{n+1}=x_n-\frac{f'(\alpha) r_n + f''(\alpha) r_n^2/2 + o(r_n^2)}{f'(\alpha)+f''(\alpha) r_n+o(r_n)}$$

since $f(\alpha)=0$; the numerator and denominator are the Taylor expansions of $f(x_n)$ and $f'(x_n)$ about $\alpha$. When $f'(\alpha) \neq 0$, regardless of anything else, you can then write

$$x_{n+1}=x_n - r_n \frac{1+\frac{f''(\alpha)}{2f'(\alpha)} r_n+o(r_n)}{1+\frac{f''(\alpha)}{f'(\alpha)} r_n + o(r_n)}.$$

Then you can further write

$$x_{n+1}=x_n-r_n \left ( 1+\frac{f''(\alpha)}{2f'(\alpha)} r_n + o(r_n) \right ) \left ( 1-\frac{f''(\alpha)}{f'(\alpha)} r_n + o(r_n) \right ) \\ = x_n - r_n \left ( 1-\frac{f''(\alpha)}{2f'(\alpha)} r_n + o(r_n) \right ) \\ = \alpha + \frac{f''(\alpha)}{2f'(\alpha)} r_n^2 + o(r_n^2).$$
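The asymptotic constant $\frac{f''(\alpha)}{2f'(\alpha)}$ appearing here can be checked numerically. A small sketch (the example function is my choice, not from the derivation): for $f(x)=e^x-1$ with root $\alpha=0$ we have $f'(0)=f''(0)=1$, so the ratio $r_{n+1}/r_n^2$ should approach $1/2$.

```python
import math

# f(x) = e^x - 1 has root alpha = 0 with f'(0) = 1 and f''(0) = 1,
# so the predicted error constant f''(alpha) / (2 f'(alpha)) is 1/2.
x = 0.5
errs = [x]
for _ in range(3):
    x = x - (math.exp(x) - 1) / math.exp(x)  # Newton step
    errs.append(x)                           # r_n = x_n - 0

# Successive quadratic ratios r_{n+1} / r_n^2 approach 0.5.
ratios = [errs[n + 1] / errs[n] ** 2 for n in range(3)]
```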

You can still carry out this "further" computation if $f''(\alpha)=0$; the result then reads

$$x_{n+1}=\alpha + o(r_n^2)$$

which does not fully describe the behavior of the error: it only tells you that the error is $o(r_n^2)$, i.e. the convergence is faster than quadratic. Pinning down the exact rate would require expanding to third order, bringing in $f'''(\alpha)$ as you suspected.
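This faster-than-quadratic behavior is easy to observe numerically. A sketch (example function chosen here for illustration): $f(x)=x+x^3$ has root $\alpha=0$ with $f'(0)=1$ and $f''(0)=0$, so the quadratic ratio $r_{n+1}/r_n^2$ should shrink toward $0$.

```python
# f(x) = x + x^3 has root alpha = 0 with f'(0) = 1 and f''(0) = 0,
# so the error should be o(r_n^2): the quadratic ratio tends to 0.
x = 0.5
errs = [x]
for _ in range(3):
    # Newton step; algebraically it simplifies to 2x^3 / (1 + 3x^2)
    x = x - (x + x ** 3) / (1 + 3 * x ** 2)
    errs.append(x)

# r_{n+1} / r_n^2 decreases toward 0 (faster than quadratic);
# for this particular f the error in fact behaves like 2 r_n^3.
quad_ratios = [errs[n + 1] / errs[n] ** 2 for n in range(3)]
```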