When to use Newton's method, bisection, fixed-point iteration, and the secant method?


I've been introduced, more or less, to these methods of finding a root of a function (a point where its graph crosses the $x$-axis), but I'm not sure when each should be used or what the advantages of one method over the others are.

I think it would be nice to have an answer that compares them and shows the situations where one method is more advantageous than another.

So, my questions are:

  • When should we use one method over the others, and why?

  • And what, accordingly, are the advantages and disadvantages of one method over the others?

Also, if you want to provide a brief explanation of each method, I think the answer would be more complete and interesting.

2 Answers

You should never seriously use bisection.

If you think that derivatives are hard, use the secant method.
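For reference, here is a minimal sketch of the secant method; the function name, tolerance, stopping rule, and test function are illustrative choices, not prescribed by the answer:

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Find a root of f via secant steps from the starting guesses x0, x1."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if f1 == f0:
            raise ZeroDivisionError("flat secant; try different starting points")
        # Intersect the line through (x0, f0) and (x1, f1) with the x-axis.
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    raise RuntimeError("secant method did not converge")

# Example: root of x^3 - 2x - 5 near x = 2.
root = secant(lambda x: x**3 - 2*x - 5, 2.0, 3.0)
```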

If you want to force convergence and can find an interval on which the function changes sign, then use one of the anti-stalling variants of regula falsi, such as the Illinois method. If you want faster convergence, use a method based on quadratic or inverse quadratic interpolation, like Muller's or Brent's.
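As a concrete instance, a minimal sketch of the Illinois variant of regula falsi (the interface and tolerances are my own choices; the halving of the stale endpoint's function value is the Illinois trick, the rest is ordinary false position):

```python
def illinois(f, a, b, tol=1e-12, max_iter=100):
    """Regula falsi with the Illinois anti-stalling modification.

    Requires f(a) and f(b) to have opposite signs.
    """
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        # False-position step: x-intercept of the secant through the bracket.
        c = b - fb * (b - a) / (fb - fa)
        fc = f(c)
        if abs(fc) < tol:
            return c
        if fb * fc < 0:
            # Sign change between b and c: the old b becomes the new a.
            a, fa = b, fb
        else:
            # The endpoint a survived again; halve its stored function value
            # so the next secant rotates toward the root (the Illinois trick).
            fa /= 2
        b, fb = c, fc
    raise RuntimeError("regula falsi did not converge")

# Example: root of x^3 - 2x - 5 on the bracket [2, 3].
root = illinois(lambda x: x**3 - 2*x - 5, 2.0, 3.0)
```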

If derivatives are not so hard to compute, then use Newton's method. To encourage convergence from poor starting points, combine it with a line search.
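A minimal sketch of Newton's method safeguarded by a backtracking line search on $|f|$; the step-halving schedule is one common choice among several, and the names here are illustrative:

```python
import math

def newton_damped(f, df, x, tol=1e-12, max_iter=50):
    """Newton's method with backtracking: halve the step until |f| decreases."""
    fx = f(x)
    for _ in range(max_iter):
        step = fx / df(x)                      # full Newton step
        t = 1.0
        while abs(f(x - t * step)) >= abs(fx) and t > 1e-10:
            t /= 2                             # backtrack until |f| drops
        x_new = x - t * step
        if abs(x_new - x) < tol:
            return x_new
        x, fx = x_new, f(x_new)
    raise RuntimeError("Newton iteration did not converge")

# Example: the full Newton step overshoots for f(x) = arctan(x) from x = 3,
# but the damped iteration still converges to the root at 0.
root = newton_damped(math.atan, lambda x: 1 / (1 + x**2), 3.0)
```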


My opinion differs a bit from Lutz Lehmann's. Here are my thoughts concerning root-finding in one dimension:

TL;DR: although no method is best in every situation and each has its drawbacks, it is often the case that another method can be used to counteract a given drawback. The costs of this mindset are either needing some understanding of the provided function (will Newton's method work well here?) or writing more complicated code that combines multiple methods (which method should be used on each iteration?).

  • You should never use bisection on its own, unless you are absolutely certain the function cannot be approximated well by a line near the root, a rather extreme case in which no method outperforms bisection. Even if you believe this may be the case, you may want to look at the following points.

  • You should, whenever possible, use bisection together with other methods, even if you believe the root may not be approximated well by a line. Such a combination is called a hybrid method, and it can be used to guarantee convergence in the event that something like Newton's method is converging too slowly or, worse, diverging. See Newt-safe for example; a sketch of such a hybrid appears after this list.

  • For bracketing methods (methods which bound the root, like bisection; these should be used whenever possible), it is often better to first try something else before resorting to bisection, which should be a last resort. Lutz's answer mentions several of these.

  • Newton's method should be reserved for cases when computing $f(x)/f'(x)$ is quite easy (such as for a polynomial). Otherwise it is probably simpler to just use the secant method.

  • Fixed-point iteration should never be used outside of a theoretical situation. There are not many advantages here aside from being marginally simpler to use in really simple or specific situations (a plain sketch appears at the end of this answer).
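As promised in the hybrid-method point above, here is a minimal sketch in the spirit of Newt-safe: take the Newton step when it lands strictly inside the current bracket, and bisect otherwise. The interface and fallback rule are my own choices, not a fixed standard:

```python
import math

def newton_bisect(f, df, a, b, tol=1e-12, max_iter=100):
    """Hybrid root-finder on a bracket [a, b] with f(a)*f(b) < 0:
    take the Newton step when it stays inside the bracket, bisect otherwise.
    """
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    x = 0.5 * (a + b)
    for _ in range(max_iter):
        fx = f(x)
        if fx == 0:
            return x
        # Maintain the bracket: replace the endpoint whose sign matches f(x).
        if fa * fx < 0:
            b, fb = x, fx
        else:
            a, fa = x, fx
        if b - a < tol:
            return 0.5 * (a + b)
        dfx = df(x)
        trial = x - fx / dfx if dfx != 0 else a   # a fails the test below
        # Safeguard: accept the Newton step only if it stays strictly inside.
        x = trial if a < trial < b else 0.5 * (a + b)
    raise RuntimeError("hybrid iteration did not converge")

# Example: safe even where plain Newton would run off (flat tails of tanh).
root = newton_bisect(math.tanh, lambda x: 1 / math.cosh(x)**2, -1.0, 10.0)
```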

It is worth noting that although the second and third points are desirable, they are not always necessary. You may be able to deduce beforehand that Newton's method is perfectly fine on its own, without any of the additional considerations above. On top of this, although the concept of a hybrid method sounds enticing, it can lead to significantly more complicated code. In other words, a hybrid may be overkill for the given problem.
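Finally, the plain fixed-point iteration mentioned in the list above, for completeness. The rewrite of the equation into $x = g(x)$ form is an illustrative choice; the method only converges when $g$ is a contraction near the fixed point:

```python
def fixed_point(g, x, tol=1e-12, max_iter=1000):
    """Iterate x <- g(x); converges only if |g'| < 1 near the fixed point."""
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("iteration did not converge; g may not be a contraction")

# Root of x^3 - 2x - 5 = 0 rewritten as x = (2x + 5)^(1/3): this g is a
# contraction near the root, so the iteration converges, though only linearly.
root = fixed_point(lambda x: (2 * x + 5) ** (1 / 3), 2.0)
```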