Is Newton-Raphson the best we can do if we only know the derivative?


I was wondering if Newton-Raphson (NR) is the fastest method to find a root when all we know about a function is how to evaluate it and its derivative at any point.

Since you can use the first derivative to approximate the second, I was wondering whether this lets you converge on a root faster (e.g. cubically, with $\epsilon_{n+1} \approx C\epsilon_n^3$)?



BEST ANSWER

I came across an answer to my own question. As detailed in the linked article, the following modification of Newton-Raphson does indeed use only the first derivative and converges cubically: https://www.sciencedirect.com/science/article/pii/S0377042703003911

$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n - \frac{f(x_n)}{2f'(x_{n})})}$
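For anyone who wants to try it, here is a minimal Python sketch of the iteration above (the function names, tolerance, and stopping rule are my own choices, not from the article):

```python
# Cubically convergent Newton variant: take a half Newton step,
# then evaluate the derivative there instead of at x_n.
def modified_newton(f, fprime, x, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        mid = x - fx / (2 * fprime(x))   # half Newton step
        x = x - fx / fprime(mid)         # full step with f'(mid)
    return x

# Example: cube root of 2 as the root of x^3 - 2.
root = modified_newton(lambda x: x**3 - 2, lambda x: 3 * x**2, 1.0)
```

Each iteration costs one evaluation of $f$ and two of $f'$, so the cubic order comes at the price of one extra derivative evaluation per step compared with plain Newton.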

ANSWER

About the only thing you can do to improve on Newton-Raphson is to abandon its habit of forgetting every point you've calculated except the last.

For example, instead of assuming at each step that the function is linear and using that assumption to find the next estimate of the root: find $x_0$ by guessing; find $x_1$ by assuming the function is linear, using the values $f(x_0)$ and $f'(x_0)$; find $x_2$ by assuming the function is a cubic, using $f(x_0), f(x_1), f'(x_0), f'(x_1)$; find $x_3$ by assuming the function is a quintic; and so on.

But this gives only very minor improvements in convergence speed. It has been too long for me to remember the details, but I think convergence is still quadratic in the limit.
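One way to sketch the "remember previous points" idea in code is inverse cubic Hermite interpolation: treat $x$ as a function of $y = f(x)$, fit a cubic through the two most recent points using $dx/dy = 1/f'(x)$, and evaluate it at $y = 0$, so no polynomial root-solving is needed. This is a variant of the scheme described above rather than the exact construction, and all names below are made up:

```python
# One step: inverse cubic Hermite interpolation through the two
# most recent points (x0, x1), evaluated at y = 0.
def hermite_root_step(f, fprime, x0, x1):
    y0, y1 = f(x0), f(x1)
    m0, m1 = 1 / fprime(x0), 1 / fprime(x1)  # dx/dy at each point
    h = y1 - y0
    t = (0 - y0) / h                          # parameter for y = 0
    # Standard cubic Hermite basis functions.
    h00 = 2*t**3 - 3*t**2 + 1
    h10 = t**3 - 2*t**2 + t
    h01 = -2*t**3 + 3*t**2
    h11 = t**3 - t**2
    return h00*x0 + h10*h*m0 + h01*x1 + h11*h*m1

def iterate(f, fprime, x0, tol=1e-12, max_iter=50):
    x1 = x0 - f(x0) / fprime(x0)  # one Newton step for a second point
    for _ in range(max_iter):
        if abs(f(x1)) < tol:
            return x1
        x0, x1 = x1, hermite_root_step(f, fprime, x0, x1)
    return x1

root = iterate(lambda x: x**3 - 2, lambda x: 3 * x**2, 1.0)
```

Note this sketch has none of the safeguards a production root-finder needs (e.g. guarding against $y_1 = y_0$ or $f'(x) = 0$).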

Alternatively, instead of polynomials, you can use other families of functions in the hope of better performance; polynomials are not always that good at interpolating or extrapolating. For example, Brent's method, a widely used root-finding algorithm for when you don't even have derivatives, relies on inverse quadratic interpolation through the three most recent values of the function.
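For concreteness, here is a minimal sketch of a single inverse quadratic interpolation step of the kind Brent's method uses: fit $x$ as a quadratic in $y$ through three points and evaluate at $y = 0$. The real Brent's method wraps this in bisection safeguards; the names and the bare iteration loop here are my own:

```python
# One inverse quadratic interpolation (IQI) step: Lagrange form of
# the inverse interpolant x(y), evaluated at y = 0.  No derivatives.
def iqi_step(f, a, b, c):
    fa, fb, fc = f(a), f(b), f(c)
    return (a * fb * fc / ((fa - fb) * (fa - fc))
            + b * fa * fc / ((fb - fa) * (fb - fc))
            + c * fa * fb / ((fc - fa) * (fc - fb)))

# Unsafeguarded iteration for illustration only: real Brent falls
# back to bisection when IQI misbehaves or f-values coincide.
def iqi_solve(f, a, b, c, tol=1e-12, max_iter=50):
    x = c
    for _ in range(max_iter):
        x = iqi_step(f, a, b, c)
        if abs(f(x)) < tol:
            return x
        a, b, c = b, c, x
    return x

root = iqi_solve(lambda x: x**3 - 2, 1.0, 1.5, 2.0)
```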