What are the pros and cons of Newton's Method versus the Relaxed version?


Newton's method for solving $f(x) = 0$ is $x_{k+1} = x_{k} - \frac{f(x_{k})}{f'(x_{k})}$. The relaxation of this method replaces $\frac{1}{f'(x_{k})}$ with a nonzero real constant $y$: $x_{k+1} = x_{k} - yf(x_{k})$.

Which method converges faster and why?

How do you derive the error equations in terms of consecutive iterations?

Furthermore, given the error equation of an iterative method how do you derive its order of convergence?


Use Taylor series in the form $f(x+h) =f(x)+hf'(x)+O(h^2) $.
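As a quick numerical sanity check on that remainder term (a sketch, not part of the original argument): halving $h$ should divide the first-order Taylor remainder by roughly $4$. Here $\sin$ is an arbitrary smooth test function, not anything assumed by the post.

```python
import math

# Check that f(x+h) - [f(x) + h f'(x)] = O(h^2):
# each halving of h should cut the remainder by a factor of about 4.
# f = sin (so f' = cos) is just an illustrative smooth function.
x = 1.0
for h in [1e-1, 5e-2, 2.5e-2]:
    remainder = math.sin(x + h) - (math.sin(x) + h * math.cos(x))
    print(f"h = {h:.4f}   remainder = {remainder:+.3e}   remainder/h^2 = {remainder / h**2:+.4f}")
```

The ratio `remainder / h**2` settles near a constant ($-\tfrac12\sin 1$ here), confirming the quadratic scaling.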

Then

$\begin{array}{ll} \text{Newton}\\ f(x_{k+1}) &=f\left(x_{k} - \frac{f(x_{k})}{f'(x_{k})}\right)\\ &=f(x_{k}) - \frac{f(x_{k})}{f'(x_{k})}f'(x_k)+O\left(\left(\frac{f(x_{k})}{f'(x_{k})}\right)^2\right) \qquad x = x_k,\ h = - \frac{f(x_{k})}{f'(x_{k})}\\ &=O\left(\left(\frac{f(x_{k})}{f'(x_{k})}\right)^2\right)\\ \\ \text{Relaxed}\\ f(x_{k+1}) &=f(x_{k} - yf(x_k))\\ &=f(x_{k}) - yf(x_{k})f'(x_k)+O\left(\left(yf(x_{k})\right)^2\right) \qquad x = x_k,\ h = -yf(x_k)\\ &=f(x_{k})\left(1- yf'(x_k)\right)+O\left(\left(yf(x_{k})\right)^2\right)\\ \end{array} $

So Newton's error decreases quadratically, while the relaxed method's error decreases only linearly, with contraction factor $|1 - yf'(x_k)|$ per step (which must be less than $1$ near the root for the relaxed iteration to converge at all).
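The contrast is easy to see numerically. Below is a minimal sketch (not from the original post) using $f(x) = x^2 - 2$, whose root is $\sqrt{2}$, with an arbitrarily chosen admissible relaxation constant $y = 0.3$ satisfying $|1 - yf'(\sqrt{2})| < 1$:

```python
import math

# Illustrative problem: f(x) = x^2 - 2, root sqrt(2).
f = lambda x: x * x - 2
df = lambda x: 2 * x
root = math.sqrt(2)

x_newton = x_relaxed = 1.5   # common starting guess
y = 0.3                      # relaxation constant, |1 - y f'(root)| ~ 0.15 < 1

for k in range(6):
    x_newton = x_newton - f(x_newton) / df(x_newton)   # Newton step
    x_relaxed = x_relaxed - y * f(x_relaxed)           # relaxed step
    print(f"k={k}  Newton err={abs(x_newton - root):.2e}  "
          f"relaxed err={abs(x_relaxed - root):.2e}")
```

Newton's error roughly squares each step (the number of correct digits doubles), while the relaxed error only shrinks by a near-constant factor per step.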