Rate of convergence of an iterative root finding method similar to Newton-Raphson


We are defining an algorithm as follows:

Let $f(x)$ be a continuous function with a root in $[a,b]$. We define a sequence $\{x_k\}_{k=1}^{\infty}$ by choosing some $x_1\in[a,b]$ and setting $x_{k+1}=x_k-f(x_k)\frac{b-a}{f(b)-f(a)}$.

Does $\{x_k\}_{k=1}^{\infty}$ converge to a root of $f(x)$? If so, how quickly?

This seems somewhat similar to Newton-Raphson, and that doesn't always converge, so I'm thinking this doesn't converge either. But perhaps I am wrong? How do I show this?
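To experiment with the question, here is a minimal numerical sketch of the iteration for one concrete choice of $f$ (the function $f(x)=x^2-2$ on $[1,2]$ is my own example, not part of the question):

```python
# Sketch of the iteration x_{k+1} = x_k - f(x_k) * (b - a) / (f(b) - f(a))
# for the (assumed) example f(x) = x^2 - 2 on [a, b] = [1, 2].

def f(x):
    return x * x - 2.0

a, b = 1.0, 2.0
slope = (f(b) - f(a)) / (b - a)  # fixed slope estimate, here (2 - (-1)) / 1 = 3

x = 1.0                          # starting point in [a, b]
for _ in range(50):
    x = x - f(x) / slope         # same step every iteration; slope never updated

print(x)  # approaches the root sqrt(2) ≈ 1.41421356...
```

In this example the iterates do converge to $\sqrt{2}$, but that is a property of this particular $f$ and interval, not an answer to the general question.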

Provided that the estimated derivative $\frac{f(b)-f(a)}{b-a}$ has the same sign as the actual derivative near the root and is no less than half its magnitude, convergence is guaranteed (provided you start close enough) and is linear. This can be proven easily by recognizing that the iteration has the form $x_{n+1}=\Phi(x_n)$ with $\Phi(x)=x-f(x)\frac{b-a}{f(b)-f(a)}$, which is fixed-point iteration. Computing $\Phi'(x)$ and checking where $|\Phi'(x)|<1$ will then tell you what you need to know.
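Concretely, writing $s$ for the estimated slope and $r$ for the root (names I introduce here for brevity), the computation the answer alludes to goes:

```latex
\Phi(x) = x - \frac{f(x)}{s}, \qquad s = \frac{f(b)-f(a)}{b-a},
\qquad \Phi'(x) = 1 - \frac{f'(x)}{s},

|\Phi'(r)| < 1
\;\Longleftrightarrow\; \left|1 - \frac{f'(r)}{s}\right| < 1
\;\Longleftrightarrow\; 0 < \frac{f'(r)}{s} < 2 .
```

The last condition says exactly that $s$ has the same sign as $f'(r)$ and $|s| > \tfrac{1}{2}|f'(r)|$; when it holds, the error shrinks by roughly the factor $|\Phi'(r)|$ per step, which is linear convergence.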

Tangentially related, it is interesting to note that this modified Newton method, which reuses the same estimated derivative at every step, is not completely novel. In fact, it is often used in the higher-dimensional case, where recomputing the Jacobian and inverting it at every iteration can be very time-consuming.
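A small sketch of that frozen-Jacobian idea in two dimensions (the system $F$, the starting point, and the iteration count are my own assumptions for illustration):

```python
# Modified Newton with a "frozen" Jacobian: the Jacobian is computed and
# inverted once at the starting point, then reused on every iteration.
# Assumed example system: F(x, y) = (x^2 + y^2 - 4, x - y),
# which has a root at (sqrt(2), sqrt(2)).

def F(x, y):
    return (x * x + y * y - 4.0, x - y)

# Jacobian of F at the starting point (1, 1): [[2x, 2y], [1, -1]] = [[2, 2], [1, -1]]
j11, j12, j21, j22 = 2.0, 2.0, 1.0, -1.0
det = j11 * j22 - j12 * j21          # = -4, so the frozen Jacobian is invertible
i11, i12 = j22 / det, -j12 / det     # inverse of the frozen 2x2 Jacobian
i21, i22 = -j21 / det, j11 / det

x, y = 1.0, 1.0
for _ in range(60):
    f1, f2 = F(x, y)
    x -= i11 * f1 + i12 * f2         # same inverse Jacobian applied every step
    y -= i21 * f1 + i22 * f2

print(x, y)  # both coordinates approach sqrt(2)
```

The trade-off is the same as in one dimension: each step is cheaper (no factorization), but convergence drops from quadratic to linear.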