Computing rate of convergence for fixed point iteration?


I have written a function to compute the fixed point iteration for $f(x) = x^2 - 3x + 2 = 0$ using $g(x)=\sqrt{3x-2}$. I know that it has linear convergence with constant $0.75$, however my code does not reproduce that constant. I estimate the order of convergence as $p = \log{\left( \frac{e_{i+1}}{e_i} \right)} \Big/ \log{\left( \frac{e_{i}}{e_{i-1}} \right)}$

import math

def convergence_rate(x_k,x_0,x_1): #using 2 because that is the root I am finding
    return math.log(abs(x_k-2.0)/abs(x_0-2.0))/math.log(abs(x_0-2.0)/abs(x_1-2.0))

def g2(initial,eps=10**10, iterations=100):
    x_k = initial
    x_k_1 = math.sqrt((3.0*x_k)-2.0)
    i = 0
    x_0, x_1, x_2 = (0,0,0)
    while f(x_k) != 0 and i < iterations:
        x_0, x_1, x_2 = (x_k, x_0, x_1)
        x_k_1 = math.sqrt((3.0*x_k)-2.0)
        x_k = x_k_1
        i+=1
    print(convergence_rate(x_k,x_0,x_1)) #gives 1.0065524232348098
    return x_k #gives 2.0000000000001177
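As a quick sanity check of the order formula above (a sketch, not part of the original code): for a purely geometric error sequence $e_i = C q^i$ the formula returns exactly $1$, regardless of the ratio $q$, which is why a linearly convergent method always reports $p \approx 1$ here.

```python
import math

# Synthetic error sequence with ratio q = 0.75 (linear convergence).
e = [0.5 * 0.75**i for i in range(3)]  # e_0, e_1, e_2

# Order estimate: log(e_2/e_1) / log(e_1/e_0).
p = math.log(e[2] / e[1]) / math.log(e[1] / e[0])
print(p)  # 1.0
```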

There is 1 answer below.

BEST ANSWER

Your result seems to be right: you get $1$ as the order of convergence, which is linear convergence. To get the base of the geometric series of the error, you also need to print the ratios $e_{i+1}/e_i$.

import math

def convergence_rate(x_0,x_1,x_2): #using 2 because that is the root being approximated
    e = [ abs(x-2.0) for x in [x_0,x_1,x_2] ]
    q_1, q_0 = e[2]/e[1], e[1]/e[0]
    p = math.log(q_1)/math.log(q_0)
    return p, e[2]/e[1]**p

def f(x): return x*(x-3)+2
def g(x): return math.sqrt(3*x-2)
def g2(initial, eps=1e-12, iterations=100):
    x_0 = x_1 = initial
    x_2 = g(x_0)
    i = 0
    while abs(f(x_2))> eps and i < iterations:
        x_0, x_1, x_2 = x_1, x_2, g(x_2)
        i+=1
        print (i,x_0,convergence_rate(x_0,x_1,x_2)) 
    return x_2 

printing in the last iterations before termination

(90, 2.000000000006542 , (0.9999213489596369, 0.7484654875617822))
(91, 2.0000000000049063, (1.0002796845555506, 0.7554822844199645))
(92, 2.0000000000036797, (1.0000932371170645, 0.7517828434452515))
(93, 2.0000000000027596, (0.9996271912885731, 0.7425151389096847))
(94, 2.0000000000020695, (1.000331542705844 , 0.7567197274932704))
(95, 2.000000000001552 , (0.9992266637737944, 0.7343235383224632))

showing the expected factor close to $0.75$.
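This matches the theory: for a fixed-point iteration the asymptotic error ratio is $|g'(r)|$ at the fixed point $r$, and here $g'(x) = \frac{3}{2\sqrt{3x-2}}$, so $g'(2) = 3/4$. A minimal check (my own sketch, not part of the original answer):

```python
import math

# Derivative of g(x) = sqrt(3x - 2); the linear-convergence constant
# is |g'(r)| at the fixed point r = 2.
def g_prime(x):
    return 3.0 / (2.0 * math.sqrt(3.0 * x - 2.0))

print(g_prime(2.0))  # 0.75
```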