A proof that Newton method converges


I am asked to write down the Taylor series for a function $f$ evaluated at $x + h$ in terms of $f(x)$ and its derivatives evaluated at $x$. Then, to use this result to show that if $x_0$ is an approximate solution of the equation $f(x) = 0$, then a better approximation is given, in general, by $x = x_1$, where $$ x_1=x_0-\frac{f(x_0)}{f'(x_0)}. $$ My attempt: Expanding about $x=x_0$ and excluding higher-order terms, $$ f(x_0+h)=f(x_0)+f'(x_0)\bigl((x_0+h)-x_0\bigr)=f(x_0)+f'(x_0)h. $$ Writing $x_1=x_0+h$ and setting the right-hand side equal to $0$ (since we want $x_1$ to satisfy $f(x_1)\approx 0$), we get $$ f(x_0)+f'(x_0)(x_1-x_0)=0 $$ and so $$ x_1=x_0-\frac{f(x_0)}{f'(x_0)}. $$ However, this does not necessarily show that $x_1$ is a better approximation, right? What am I missing?
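The update you derived is easy to sketch in code. Here is a minimal illustration of one Newton step, assuming $f$ and $f'$ are supplied as Python callables (all names here are illustrative, not from the question):

```python
# One Newton step: x1 = x0 - f(x0)/f'(x0), derived by setting the
# first-order Taylor approximation f(x0) + f'(x0)(x1 - x0) to zero.
def newton_step(f, fprime, x0):
    return x0 - f(x0) / fprime(x0)

# Example: f(x) = x^2 - 2, whose positive root is sqrt(2) = 1.41421...
f = lambda x: x * x - 2.0
fp = lambda x: 2.0 * x

x0 = 1.5
x1 = newton_step(f, fp, x0)  # 1.5 - 0.25/3 = 1.41666...
```

Starting from $x_0 = 1.5$, one step already lands within about $2.5\times 10^{-3}$ of $\sqrt{2}$, consistent with the quadratic error bound in the answer below.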



Best answer:

Let $z$ be the solution of $f(x)=0$. Taylor's formula gives you that $$f(z) = f(x_0) +f'(x_0) (z- x_0) + \frac 12 f''(\xi)(z-x_0)^2 $$ for some $\xi$ between $x_0$ and $z$. Dividing by $f'(x_0)$, $$ 0 = \frac{f(x_0)}{f'(x_0)}+(z-x_0)+\frac{f''(\xi)}{2f'(x_0)}(z-x_0)^2 \Rightarrow $$ $$ |z - x_1| = \frac{|f''(\xi)|}{2|f'(x_0)|}|z-x_0|^2. $$

The approximation does have the potential to be better than $x_0$, since the error of $x_1$ is proportional to the square of the error of $x_0$, but, as you say, the stated conditions do not guarantee this. If $|z-x_0|$ or the coefficient $\frac{|f''(\xi)|}{2|f'(x_0)|}$ is large, $x_1$ may turn out to be a worse approximation than $x_0$.
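Both behaviors are easy to demonstrate numerically. The sketch below (function choices are illustrative, not from the answer) shows quadratic convergence for $f(x)=x^2-2$, and a case where the step makes things worse: for $g(x)=\arctan(x)$ with root $z=0$, a starting point as far out as $x_0=2$ overshoots, and $|x_1|>|x_0|$:

```python
import math

def newton_step(f, fp, x):
    # x1 = x0 - f(x0)/f'(x0)
    return x - f(x) / fp(x)

# Quadratic convergence: f(x) = x^2 - 2, root z = sqrt(2).
# The error roughly squares at every step, as the bound
# |z - x1| = |f''(xi)| / (2|f'(x0)|) * |z - x0|^2 predicts.
f = lambda x: x * x - 2.0
fp = lambda x: 2.0 * x
x = 1.5
for _ in range(4):
    x = newton_step(f, fp, x)
# After 4 steps x agrees with sqrt(2) to machine precision.

# A worse approximation: g(x) = arctan(x), root z = 0.
# Here |z - x0| is large enough that the new iterate overshoots.
g = lambda x: math.atan(x)
gp = lambda x: 1.0 / (1.0 + x * x)
x0 = 2.0
x1 = newton_step(g, gp, x0)
# x1 = 2 - atan(2) * (1 + 4) ≈ -3.536, so |x1| > |x0|:
# the iterate moved *away* from the root.
```

This matches the answer's caveat: the squared-error relation only helps once $|z-x_0|$ is small enough relative to the coefficient $\frac{|f''(\xi)|}{2|f'(x_0)|}$.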