Prove Newton's iteration will diverge for these functions, no matter what real starting point is selected. $f(x)=x^2+1$ and $g(x)=7x^4+3x^2+\pi$.



We know that $f(x)>0$ and $g(x)>0$ for all $x\in \mathbb{R}$, so these polynomials have no real roots and Newton's Method doesn't apply. Is this essentially what the proof is, or do I need to do it a different way? I don't see how to do this any other way. Any solutions or hints are greatly appreciated.


There are 4 answers below.


The recurrence for Newton's iteration is $$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$ If $x_n$ converges to some limit $L$, then taking limits on both sides of this equation gives $$L = L - \lim_{n \rightarrow \infty} \frac{f(x_n)}{f'(x_n)}$$ In other words, $$\lim_{n \rightarrow \infty} \frac{f(x_n)}{f'(x_n)} = 0 \tag 1$$ If $f'(L) \neq 0$, then by continuity $$\frac{f(L)}{f'(L)} = 0,$$ in other words $f(L) = 0$. Since neither of your functions has a real zero, this case cannot happen.

So you have $f'(L) = 0$. What remains is to show that $(1)$ is impossible in this situation. By continuity, $f(x_n)$ converges to $f(L)$, while $f'(x_n)$ converges to zero. Since neither of your functions ever vanishes, $f(L) \neq 0$, so the ratio $\frac{f(x_n)}{f'(x_n)}$ is unbounded as $n$ goes to infinity. Hence $(1)$ does not occur.

The same argument shows that Newton's iteration fails to converge for any continuously differentiable function that never vanishes.
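As a numerical illustration (not part of the proof; the starting point and iteration count are my own arbitrary choices), one can check that for $f(x)=x^2+1$ every Newton step has size $\frac{x_n^2+1}{2|x_n|} \ge 1$ by the AM–GM inequality ($x^2+1 \ge 2|x|$), so the iterates can never settle down:

```python
# Newton's iteration for f(x) = x^2 + 1, which has no real root.
# Each step has size (x^2 + 1) / (2|x|) >= 1 by AM-GM, so the
# sequence can never be Cauchy, let alone convergent.

def newton_step(x):
    """One Newton step for f(x) = x^2 + 1, with f'(x) = 2x."""
    return x - (x * x + 1) / (2 * x)

x = 2.0  # arbitrary nonzero starting point
steps = []
for _ in range(50):
    if x == 0.0:          # f'(0) = 0: the iteration would have to stop
        break
    x_next = newton_step(x)
    steps.append(abs(x_next - x))
    x = x_next

print(min(steps))  # never smaller than 1 (up to rounding)
```

The iterates bounce around chaotically, but the minimum observed step size confirms they never approach a limit.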


You are right to point out that if a function has no roots, then Newton's method for that function obviously cannot converge to one. However, both $f$ and $g$ have complex roots, and Newton's method started from a complex point may well converge to one of them. So the question should be understood as being about real starting points and real limits only.

It seems to me that you are asked to examine the method itself for the functions given. That is, for $f(x)=x^2+1$, for example, you are asked to show that however you choose the initial point $x_0$, if you define

$$x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}$$

then necessarily the sequence $x_n$ will not converge. Indeed, assume $x_0\neq 0$. Since $f'(x)=2x$, you will have, for $n\geq 0$:

$$x_{n+1}=x_n-\frac{x_n^2+1}{2x_n}=\frac{x_n^2-1}{2x_n}$$

If the limit $\lim_n x_n=L$ exists and is different from zero, then passing to the limit on both sides of the iteration you get:

$$L=\frac{L^2-1}{2L}$$

whence $L^2=-1$, so $L$ is not real. If $L=0$, then the right-hand side, namely $\frac{x_n^2-1}{2x_n}$, is unbounded as $x_n\to 0$ (its absolute value tends to $\infty$), contradicting the fact that the left-hand side tends to zero. Finally, if you start with $x_0=0$, then already $x_1$ is undefined.

A similar argument may be used for the second function as well.
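For the second function, the same phenomenon can be sketched numerically (an illustration only; the starting point, iteration count, and the conservative bound in the final comment are my own choices). Numerically, the step $g(x)/g'(x)$ for $g(x)=7x^4+3x^2+\pi$ appears never to drop below roughly $0.38$ in absolute value, so here too the iterates cannot settle:

```python
import math

# Newton steps for g(x) = 7x^4 + 3x^2 + pi, which satisfies g(x) >= pi > 0.
def newton_step(x):
    g = 7 * x**4 + 3 * x**2 + math.pi
    gp = 28 * x**3 + 6 * x          # g'(x)
    return x - g / gp

x = 1.0                              # arbitrary nonzero starting point
steps = []
for _ in range(50):
    if x == 0.0:                     # g'(0) = 0: iteration would stop
        break
    x_next = newton_step(x)
    steps.append(abs(x_next - x))
    x = x_next

print(min(steps))                    # empirically stays well above 0.3
```

Far from the origin the map roughly contracts the iterate toward $0$, but near $0$ the small derivative kicks it far away again, so the sequence oscillates forever instead of converging.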


Newton's method "applies" in the following sense: if $f$ is a real-valued differentiable function on the real line, and $x_0$ is any real starting point, then we can define the sequence $x_0, x_1, x_2, \ldots$ by $x_{n+1} = x_n - f(x_n)/f'(x_n)$ (unless we are unlucky enough to reach a point where $f'(x_n) = 0$, in which case we have to stop).

Now suppose the sequence converges: that means we don't stop, and $\lim_{n \to \infty} x_n = L$ exists as a real number. Suppose further that $f'$ is a continuous function (which is certainly the case in your examples). Then as $n \to \infty$ we have $f(x_n) \to f(L)$ and $f'(x_n) \to f'(L)$. But also $f(x_n) = f'(x_n) (x_n - x_{n+1}) \to 0$. Thus we must have $f(L) = 0$.

To put it another way (the contrapositive): if $f(x) = 0$ has no real solution, such an $L$ can't exist, i.e. the sequence must either stop or diverge.

The question is not quite correct in that it does not take into account the possibility of the sequence stopping (which is not normally included in the meaning of "diverge"). That happens if you take $x_0 = 0$ in either of your examples.
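To make the stopping case concrete, here is a small sketch (the function names, return labels, and tolerance are my own choices, not from the question) that distinguishes "stopped because $f'(x_n)=0$" from "failed to converge":

```python
# Newton's method with the stopping case made explicit: if f'(x_n) = 0
# the iteration cannot continue, which is distinct from diverging.

def newton(f, fprime, x0, max_iter=50, tol=1e-12):
    """Return ('converged', x), ('stopped', x), or ('no convergence', x)."""
    x = x0
    for _ in range(max_iter):
        d = fprime(x)
        if d == 0:                  # derivative vanishes: must stop
            return ("stopped", x)
        x_next = x - f(x) / d
        if abs(x_next - x) < tol:   # settled down to a real limit
            return ("converged", x_next)
        x = x_next
    return ("no convergence", x)

f = lambda x: x * x + 1             # no real roots
fp = lambda x: 2 * x

print(newton(f, fp, 0.0)[0])        # stopped, since f'(0) = 0
print(newton(f, fp, 2.0)[0])        # no convergence
```

Starting from $x_0 = 0$ the run stops immediately, exactly as the answer notes; from any other real start the iterates wander without converging.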


In justifying a failure to converge, one might try to use the condition $$\frac{f(x)}{f'(x)} = 0.$$ But consider $$f(x) = (x-1)^{1/3}.$$ This function obviously has the root $x = 1$, and the above criterion applied to this choice easily gives $$3(x-1) = 0,$$ or $x = 1$. Yet the iteration $$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$ is in fact divergent for any choice $x_0 \ne 1$. To see this, note that for this choice of $f$, $$x_{n+1} = x_n - 3(x_n - 1) = -2x_n + 3.$$ While $x = 1$ is a fixed point, it is a repelling one. For if $x_n = 1 + \epsilon$, we get $$x_{n+1} = -2(1+\epsilon) + 3 = 1 - 2\epsilon,$$ and $$x_{n+2} = -2(1-2\epsilon)+3 = 1+4\epsilon.$$ Thus if $\epsilon \ne 0$ represents an initial error, the absolute error from $1$ doubles with each iteration.
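The doubling of the error can be observed directly by iterating the closed-form map $x \mapsto -2x + 3$ (a sketch; the initial error of $10^{-6}$ and the step count are arbitrary choices):

```python
# The Newton map for f(x) = (x-1)^(1/3) simplifies to x -> -2x + 3.
# x = 1 is a fixed point, but the error from 1 doubles (and flips sign)
# at every step, so the fixed point is repelling.

x = 1.0 + 1e-6          # start with a small initial error
errors = [abs(x - 1.0)]
for _ in range(20):
    x = -2 * x + 3      # one Newton step in closed form
    errors.append(abs(x - 1.0))

ratios = [errors[i + 1] / errors[i] for i in range(len(errors) - 1)]
print(ratios[0])        # each ratio is approximately 2
```

After 20 steps the initial error of $10^{-6}$ has grown by a factor of $2^{20} \approx 10^6$, so the iterate is already far from the root it started next to.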

When we look at Newton's method for complex-valued functions, the dynamics are even richer; we can get limit cycles, chaos, etc. So while showing that $f/f' = 0$ has no solution is sufficient to rule out convergence, it is not a necessary criterion for demonstrating that the sequence fails to converge.