Will this combination of the Newton-Raphson and bisection methods be useful?


I want to write a program to find the roots of an equation of a single variable. Will the following steps be useful? Iteratively apply the Newton-Raphson method. At any stage during an iteration:

  1. If we find a root, then report the result and stop. Otherwise, go to step 2.

  2. If we find two points x1 and x2 at which the function values have opposite signs, then find a root between x1 and x2 using the bisection method. Otherwise, go to step 3.

  3. If we find two points x1 and x2 at which the gradients are of opposite signs, then find a point between x1 and x2 at which the gradient is zero. Suppose this point is x3. If f(x3) is of opposite sign compared to f(x1) and f(x2), then there is one root between x1 and x3 and another root between x2 and x3. We find both using the bisection method. If the three are of the same sign, then we report that the root was not found and stop. Otherwise, go to step 4.

  4. If the gradient becomes constant, this means we have found an asymptote. Report that we could not find a root and stop.

In all other cases, continue with the iteration.
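The skeleton of what I have in mind looks roughly like this (function names and tolerances are my own choices; steps 3 and 4 are omitted for brevity):

```python
def bisect(f, a, b, tol=1e-12):
    """Standard bisection on a sign-changing bracket [a, b]."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0:
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

def find_root(f, df, x0, tol=1e-12, max_iter=100):
    """Newton iteration with the fallback of steps 1-2.

    Steps 3-4 (gradient sign change / asymptote detection) are left out;
    this only illustrates the Newton + bisection skeleton.
    """
    x_prev, x = x0, x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:                # step 1: found a root
            return x
        if f(x_prev) * fx < 0:           # step 2: sign change -> bisection
            return bisect(f, min(x_prev, x), max(x_prev, x), tol)
        dfx = df(x)
        if dfx == 0.0:
            raise ZeroDivisionError("flat gradient; cannot take a Newton step")
        x_prev, x = x, x - fx / dfx      # plain Newton step
    raise RuntimeError("no root found")

# Example: root of x^2 - 2 starting from x = 2.
print(find_root(lambda x: x * x - 2, lambda x: 2 * x, 2.0))  # ≈ 1.4142135623730951
```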

Will this be useful? Or will it find a wrong root in some cases? Any suggestions for improvements? Thank you.

1 Answer

The outlined method will mostly work, but it may of course report that no root was found even when one exists on the given interval.

A particularly notable point is what constitutes a root.

  • Is it enough that a root is bracketed using bisection? What if bisection reports a value where $f(x)$ is not close to $0$ at all, e.g. $f(x)=1/x$, where bisection converges to $x\approx0$ at the pole?
  • What if step 3 finds a point $x_3$ where $f(x_3)=0$, or very close to $0$? Should this also be considered a root?
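One way to guard against the $1/x$ case is to pair the bracket test with a residual check on $|f(x)|$ (a sketch; the tolerances and names here are my own choices):

```python
def bisect(f, a, b, xtol=1e-12):
    """Bisection on a sign change; returns the midpoint of the final bracket."""
    fa = f(a)
    while b - a > xtol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0:
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

def is_root(f, x, ftol=1e-8):
    """Residual check: reject 'roots' where f is not actually small."""
    return abs(f(x)) < ftol

f = lambda x: 1.0 / x
x = bisect(f, -1.0, 2.0)        # sign change, but it's a pole, not a root
print(x, is_root(f, x))          # x ≈ 0, is_root -> False

g = lambda x: x * x - 2
y = bisect(g, 0.0, 2.0)          # genuine root
print(y, is_root(g, y))          # y ≈ 1.41421..., is_root -> True
```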

It also seems you want to stop at the 4th step. I'm not entirely sure what is meant in that case, but it may prove a better idea to continue using Newton's method.

Finally, I'll also remark that for an algorithm like this, it is often useful to double (or multiply by some factor greater than $1$) the distance that Newton's method covers:

$$x_{n+1}=x_n-2\frac{f(x_n)}{f'(x_n)}$$

This allows for

  • Jumping over to the other side of a root so that bisection can be used to guarantee convergence.
  • Potentially faster initial convergence.
  • Potentially faster convergence for non-simple roots (such as $f(x)=x^3$).

but it may also pose other issues, such as overshooting the root or getting stuck (e.g. $f(x)=|x|$, where the doubled step sends $x$ to $-x$ and oscillates forever).
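As a concrete illustration (the function choices here are mine), the doubled step contracts by $1/3$ per iteration on the triple root $f(x)=x^3$ versus $2/3$ for plain Newton, so it converges noticeably faster there, while on $f(x)=|x|$ it just bounces between $x$ and $-x$:

```python
def newton(f, df, x0, relax=1.0, n=30):
    """Newton iteration with step x - relax * f(x)/f'(x); relax=2 doubles the step."""
    x = x0
    for _ in range(n):
        x = x - relax * f(x) / df(x)
    return x

# Triple root f(x) = x^3: each plain step multiplies x by 2/3,
# each doubled step by 1/3, so the doubled step ends much closer to 0.
plain  = newton(lambda x: x ** 3, lambda x: 3 * x * x, 1.0)
double = newton(lambda x: x ** 3, lambda x: 3 * x * x, 1.0, relax=2.0)
print(abs(double) < abs(plain))  # True

# Failure mode f(x) = |x|: the doubled step maps x to -x and never settles.
step = lambda x: x - 2 * abs(x) / (1.0 if x > 0 else -1.0)
print(step(1.0), step(-1.0))     # -1.0 1.0
```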