What is better than Newton's method for polynomial root-finding?


If we have a polynomial function $f(x)= x^4 - x^3 - x^2 - x + 1$ and let $x^*$ be a solution of $f(x) = 0$ on $[0, 1],$ I was wondering if there is a method that converges to $x^*$ even faster than Newton's method?

I am aware of the bisection and regula falsi methods, but I know Newton's method surpasses both. I read a paper/lecture where the secant method performed better than Newton's method, but I just wanted to ask the community if they know of another method.

One of my classmates suggested either Laguerre's method or Halley's method but I'm not entirely convinced.

Link to the paper/lecture I found comparing secant and Newton's methods: http://pages.cs.wisc.edu/~amos/412/lecture-notes/lecture05.pdf

Thank you!

To summarize the comments:

Exact solution

The polynomial is of 4th degree and palindromic (its coefficients read the same forwards and backwards), and thus can be expressed in $y=x+x^{-1}$: dividing by $x^2$ gives $$ x^2-x-1-x^{-1}+x^{-2}=(y^2-2)-y-1=\frac14\left((2y-1)^2-13\right). $$ These nested quadratic equations are easily solved to give the roots, among them the one cited by Simply Beautiful Art.
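As a quick numerical check of the nested quadratics, here is a short Python sketch (my own illustration, not part of the original answer) that recovers the root in $[0,1]$: first solve $y^2-y-3=0$, then $x^2-yx+1=0$.

```python
import math

# Solve x^4 - x^3 - x^2 - x + 1 = 0 on [0, 1] via the substitution y = x + 1/x.
# Dividing by x^2 gives y^2 - y - 3 = 0, then x + 1/x = y, i.e. x^2 - y*x + 1 = 0.
y = (1 + math.sqrt(13)) / 2          # the branch with y >= 2, needed for real x in (0, 1]
x = (y - math.sqrt(y * y - 4)) / 2   # the root of x^2 - y*x + 1 lying in (0, 1)

f = lambda t: t**4 - t**3 - t**2 - t + 1
print(x)       # root, approximately 0.5807
print(f(x))    # residual, numerically zero
```

The computed value agrees with the exact root of the original quartic to machine precision.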

Numerical methods

There are "naked" iteration methods like Newton's, the secant method, Muller's, Halley's, and Laguerre's, where any claim on the convergence order has to carry the caveat "if it converges at all". Then there are bracketing methods like bisection, regula falsi, Dekker's, and Brent's, where convergence to a real root is guaranteed if the iteration starts with a bracketing interval.

All the mentioned methods find "one root at a time"; to find all roots, one has to sample a sufficiently large set of initial points (with aggressive break conditions if the iteration jumps too far or the residual does not decrease). In the case of polynomials one can also apply deflation, dividing out the linear factors corresponding to the already known roots.
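A minimal sketch of deflation (my own illustration, assuming real coefficients in descending order and one real root already in hand) is synthetic division by the linear factor $(x-r)$:

```python
import math

def deflate(coeffs, r):
    """Synthetic division of a polynomial by (x - r).
    coeffs are in descending order; returns (quotient coefficients, remainder)."""
    out = [coeffs[0]]
    for c in coeffs[1:]:
        out.append(c + r * out[-1])
    remainder = out.pop()   # approximately 0 if r is a root
    return out, remainder

p = [1, -1, -1, -1, 1]               # x^4 - x^3 - x^2 - x + 1
y = (1 + math.sqrt(13)) / 2          # from the exact solution: y^2 - y - 3 = 0
r = (y - math.sqrt(y * y - 4)) / 2   # the root in (0, 1)
q, rem = deflate(p, r)               # q is cubic; its roots are the remaining roots
```

The remaining roots can then be sought in the cubic `q`, with the usual caveat that deflation accumulates rounding error, so roots found this way should be polished against the original polynomial.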

Order of convergence

The question and cited paper measure the order of convergence relative to the effort, which usually is dominated by the number of function evaluations.

The secant method and all bracketing methods take one function evaluation per step. Newton takes two (function and derivative); the third-order methods Halley and Laguerre need three function and derivative evaluations per step. Thus Newton has order $\sqrt2\approx1.41$ per function evaluation (on average), Halley $\sqrt[3]3\approx1.44$. The higher-order so-called Householder methods (I took the name from an isolated review paper; there is no proper source for that name) mentioned by Yiyuan Lee have a lower order per function evaluation, as $\sqrt[n]n$ decreases towards $1$ from $n=3$ on.
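These per-evaluation orders (the "efficiency index" $p^{1/m}$ for a method of order $p$ costing $m$ evaluations per step) are easy to tabulate; a small Python check of the numbers quoted above:

```python
import math

# Efficiency index p**(1/m): convergence order p per step, m evaluations per step.
phi = (1 + math.sqrt(5)) / 2
print(f"secant : {phi:.4f}")         # 1 evaluation/step, order phi, index ~ 1.618
print(f"Newton : {2 ** (1/2):.4f}")  # 2 evaluations/step, order 2,   index ~ 1.414
print(f"Halley : {3 ** (1/3):.4f}")  # 3 evaluations/step, order 3,   index ~ 1.442

# An n-th order method with n evaluations per step: n**(1/n) decreases towards 1.
for n in range(3, 9):
    print(n, f"{n ** (1/n):.4f}")
```

This shows why ever-higher-order methods do not pay off per evaluation: $\sqrt[n]n$ is strictly decreasing for $n\ge3$.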

Bisection and "pure" regula falsi have order one. Regula falsi with anti-stalling measures like the Illinois variant has order $\sqrt[3]3$ and is thus comparable to the third-order methods. The secant method has order $\phi=\frac{1+\sqrt5}2\approx1.6$. The order of Brent's method is about $1.8$.
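To make the Illinois variant concrete, here is a sketch (my own naming and tolerances, not code from the answer): it is regula falsi, except that when the same endpoint is retained twice in a row, its stored function value is halved so the bracket cannot stall.

```python
def illinois(f, a, b, tol=1e-12, maxit=100):
    """Illinois-modified regula falsi on a bracketing interval [a, b]."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "need a bracketing interval"
    side = 0
    c = a
    for _ in range(maxit):
        c = (a * fb - b * fa) / (fb - fa)   # false-position (secant) point
        fc = f(c)
        if abs(fc) < tol:
            return c
        if fc * fb < 0:          # sign change between c and b: move a up to c
            a, fa = c, fc
            if side == -1:
                fb *= 0.5        # Illinois modification: damp the stale endpoint
            side = -1
        else:                    # sign change between a and c: move b down to c
            b, fb = c, fc
            if side == 1:
                fa *= 0.5
            side = 1
    return c

f = lambda x: x**4 - x**3 - x**2 - x + 1
root = illinois(f, 0.0, 1.0)     # f(0) = 1 > 0, f(1) = -1 < 0, so [0, 1] brackets
```

Unlike Newton's method, this needs no derivative and no good initial guess, only the sign change on $[0,1]$.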

Newton's method and some other second-order methods can be elevated to the equivalent of a third-order method by appending a simplified step, that is, a step that reuses the derivative value from the preceding full step.
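A sketch of this composite step (my own illustration of the idea, with hypothetical names): each iteration does one ordinary Newton step and then one simplified step with the same derivative value, so it costs one extra function evaluation but raises the order of the composite step to three.

```python
def newton_plus_simplified(f, df, x, steps=5):
    """Newton step followed by a simplified step reusing the same derivative."""
    for _ in range(steps):
        d = df(x)
        y = x - f(x) / d     # ordinary Newton step: new function, new derivative
        x = y - f(y) / d     # simplified step: new function, *old* derivative
    return x

f  = lambda x: x**4 - x**3 - x**2 - x + 1
df = lambda x: 4*x**3 - 3*x**2 - 2*x - 1
root = newton_plus_simplified(f, df, 0.5)
```

Per composite step this is three evaluations (two of $f$, one of $f'$) for order three, matching Halley's $\sqrt[3]3$ per evaluation without needing second derivatives.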