What is the convergence rate of Regula-Falsi and Newton methods?


Hi everybody, I'm studying different root-finding methods: bisection, secant, Newton, and regula falsi. For another application, I need to know the convergence factor of these methods. Searching online, I saw that for bisection it corresponds to $1/2$, and for regula falsi to $\frac{1+\sqrt{5}}{2}$, but I found no explicit values for the other methods. Could you help me understand why? Is there a way to calculate the corresponding values for the other methods? A formula would be good too: since I have to use these values in Python code, I could use the formula to compute them on the fly.


2 Answers


When Newton's method converges for a typical problem, it exhibits $q$-quadratic convergence, i.e., $$ \|e_{n+1}\|\propto \|e_{n}\|^2, $$ where $e_k$ is the error at the $k$-th iteration. However, many things can change this rate of convergence, such as a near-singular derivative at the solution, which results in $q$-linear convergence (if you're lucky). If the second derivative is $0$ at the solution, then you may even get $q$-cubic convergence. And if your initial guess is in a bad place ("bad" is technical to define), the iteration may not converge at all.
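
As a sketch of this behavior, here is a short Python snippet (using the illustrative test function $f(x)=x^2-2$, which has the simple root $\sqrt2$) in which the error roughly squares, i.e. the number of accurate digits roughly doubles, at each step:

```python
from math import sqrt

def newton(f, df, x0, iters=5):
    """Newton iteration, returning all iterates."""
    xs = [x0]
    for _ in range(iters):
        x0 = x0 - f(x0) / df(x0)
        xs.append(x0)
    return xs

# Test function f(x) = x^2 - 2, with simple root sqrt(2):
# the error roughly squares each iteration until machine precision.
root = sqrt(2.0)
errors = [abs(x - root) for x in newton(lambda x: x * x - 2.0,
                                        lambda x: 2.0 * x, 1.0)]
for e in errors:
    print(e)
```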


It seems you are confusing the order and the rate of convergence. See here for the definitions. Intuitively, they can be understood by considering the number of accurate digits gained per iteration. Let $d_n$ be the number of accurate digits of the estimate. Usually, $d_n$ follows an approximate recurrence of the form

$$d_{n+1}\simeq qd_n+r$$

where $q$ is the order of convergence and $r$ is (related to) the rate of convergence. Initially, when $d_n$ is small, the rate of convergence matters most, while for large $d_n$ the order of convergence matters more.
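
This recurrence also suggests a practical way to estimate $q$ from observed errors: since $d_n\simeq-\log_{10}e_n$, three consecutive errors give $q\approx\log(e_{n+1}/e_n)/\log(e_n/e_{n-1})$. A minimal Python sketch, using Newton's method on the illustrative test function $f(x)=x^2-2$:

```python
from math import log, sqrt

def estimate_order(errors):
    """Estimate q ≈ log(e2/e1) / log(e1/e0) from three consecutive errors."""
    e0, e1, e2 = errors[-3:]
    return log(e2 / e1) / log(e1 / e0)

# Collect errors from a few Newton steps on f(x) = x^2 - 2 (root sqrt(2)):
errors = []
x, root = 1.0, sqrt(2.0)
for _ in range(4):
    x = x - (x * x - 2.0) / (2.0 * x)
    errors.append(abs(x - root))
print(estimate_order(errors[:3]))  # close to 2 for Newton
```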

The orders and rates of convergence for each of the stated methods are given by:

$$\begin{array}{c|c|c|c|c}&\rm Bisection&\rm Regula~Falsi&\rm Secant&\rm Newton\\\hline\rm Order&1&1&\frac{1+\sqrt5}2&2\\\hline\rm Rate&\frac12&-\frac{f''(x)}{2f'(x)}(x-z)&-\frac{f''(x)}{2f'(x)}&-\frac{f''(x)}{2f'(x)}\end{array}\\\small\text{$z$ denotes the second point used in Regula-Falsi}$$
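
For the Python use case in the question, this table can be encoded directly: the orders are constants, and the shared rate $-f''(x)/2f'(x)$ only needs the two derivatives at the root. The function and variable names below are illustrative, not from any particular library:

```python
# Orders of convergence are constants per method.
ORDERS = {
    "bisection": 1.0,
    "regula_falsi": 1.0,
    "secant": (1 + 5 ** 0.5) / 2,  # golden ratio, ≈ 1.618
    "newton": 2.0,
}

def asymptotic_rate(d2f_at_root, df_at_root):
    """The rate -f''(x)/(2 f'(x)) shared by secant and Newton;
    regula falsi multiplies it by the fixed offset (x - z)."""
    return -d2f_at_root / (2.0 * df_at_root)

# Example with f(x) = x^2 - 2: f'(x) = 2x, f''(x) = 2, root sqrt(2).
root = 2 ** 0.5
print(asymptotic_rate(2.0, 2.0 * root))  # ≈ -0.3536
```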

Bisection is very easy to prove, since the interval always halves. The rates of convergence for the other methods are all mostly the same, since $-f''(x)/2f'(x)$ is a measure of the curvature of $f$, or more precisely of how accurate a linear approximation of the function is. The differences in their orders of convergence, however, are more complicated. In essence, they rely on the fact that the expression

$$y-\frac{f(y)(y-z)}{f(y)-f(z)}$$

estimates the root $x$ with error given by

$$-\frac{f''(x)}{2f'(x)}(x-y)(x-z)$$

In the case of regula falsi, either $x-y$ or $x-z$ remains constant. Taking the constant one to be $x-z$, the error reduces to

$$\underbrace{-\frac{f''(x)}{2f'(x)}(x-z)}_\text{(rate)}(x-y)^{1~(\text{order})}$$
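
A Python sketch of this linear convergence, again on the illustrative test function $f(x)=x^2-2$: since $f$ is convex, the upper endpoint stays fixed, and the successive error ratios settle near the predicted rate. Note the rate formula is asymptotic, so a narrow starting bracket is used to make the agreement visible:

```python
from math import sqrt

def regula_falsi(f, a, b, iters=6):
    """False position: keep a bracket [a, b]; the endpoint whose sign
    matches the new estimate is replaced."""
    estimates = []
    for _ in range(iters):
        c = b - f(b) * (b - a) / (f(b) - f(a))
        estimates.append(c)
        if f(a) * f(c) < 0:
            b = c  # root lies in [a, c]
        else:
            a = c  # root lies in [c, b]
    return estimates

f = lambda x: x * x - 2.0
root = sqrt(2.0)
# f is convex, so z = 1.45 stays fixed; the error ratios settle near
# -f''(x)/(2 f'(x)) * (x - z) = -(sqrt(2) - 1.45)/(2*sqrt(2)) ≈ 0.0127.
errs = [abs(c - root) for c in regula_falsi(f, 1.4, 1.45)]
for e0, e1 in zip(errs, errs[1:]):
    print(e1 / e0)
```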

In the case of the secant method, both $y$ and $z$ approach the root, and the recurrence for $\log|x-y|$ grows like the Fibonacci sequence, which can be solved to give an order of convergence of $\varphi=(1+\sqrt5)/2$, the golden ratio.
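
This can be checked numerically: the order estimate $q\approx\log(e_{n+1}/e_n)/\log(e_n/e_{n-1})$ drifts toward $\varphi\approx1.618$. A sketch with the illustrative test function $f(x)=x^2-2$:

```python
from math import log, sqrt

def secant(f, x0, x1, iters=6):
    """Secant iteration, returning all iterates."""
    xs = [x0, x1]
    for _ in range(iters):
        x0, x1 = x1, x1 - f(x1) * (x1 - x0) / (f(x1) - f(x0))
        xs.append(x1)
    return xs

f = lambda x: x * x - 2.0
root = sqrt(2.0)
errs = [abs(x - root) for x in secant(f, 1.0, 2.0)]
# Estimate the order from late (but not machine-precision) errors;
# the estimate drifts toward the golden ratio ≈ 1.618.
q = log(errs[6] / errs[5]) / log(errs[5] / errs[4])
print(q)
```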

In the case of Newton's method, the limit $z\to y$ is taken, leaving us with

$$\underbrace{-\frac{f''(x)}{2f'(x)}}_\text{(rate)}(x-y)^{2~(\text{order})}$$