Consider the problem: minimize $f(x) = x^4 - 1$.


I am studying for a test and found this problem in the textbook I'm using; I think there may be a conceptual issue I'm running into. My test is on unconstrained optimization (multi-variable), but this problem comes back to the single-variable case:

Consider the problem

minimize $f(x) = x^4 - 1$.

Solve this problem using Newton's method. Start from $x_0 = 4$ and perform three iterations. Prove that the iterates converge to the solution. What is the rate of convergence? Can you explain this?

Okay, so the first part of the problem is pretty straightforward:

I obtain $f'(x)= 4x^3$ and $f''(x)=12x^2$

And substitute into Newton's method for the single-variable case:

$X_{k+1}=X_k-f'(X_k)/f''(X_k)$

Which simplifies to:

$X_{k+1}=2/3 · X_k$

Which has the general term:

$X_k=(2/3)^k · X_0$

I performed three iterations, which really aren't relevant to the rest of the problem.
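For concreteness, the three iterations can be carried out numerically; a minimal sketch of the update above, using the $x_0 = 4$ from the problem statement:

```python
# Newton's method on f(x) = x^4 - 1, three iterations from x0 = 4.
x = 4.0
for k in range(1, 4):
    x = x - (4 * x**3) / (12 * x**2)  # the update simplifies to (2/3) * x
    print(f"x_{k} = {x}")
# prints x_1 ≈ 2.6667, x_2 ≈ 1.7778, x_3 ≈ 1.1852, matching (2/3)^k * 4
```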

To prove that the iterates converge to a solution, I simply take the limit of the general term as $k$ tends to infinity:

$\lim_{k \to +\infty} X_k= X^*$

$\lim_{k \to +\infty} (2/3)^k · X_0=0$

Therefore $X^*=0$

But when I go for the rate of convergence:

$\frac{|X_{k+1}-X^*|}{|X_k-X^*|}=\frac{(2/3)^{k+1}\,|X_0|}{(2/3)^k\,|X_0|}=\frac{2}{3}$

So it would result in linear convergence, not quadratic as expected from Newton's method. I've given it some thought and can't understand why this happens (or can't find where I made a mistake in my reasoning). I would appreciate it if someone could help me out here. Thanks in advance for reading.
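To see the rate numerically, one can track the ratio of successive errors; a small sketch (with $X^* = 0$, so the error is just $|X_k|$):

```python
# Track |e_{k+1}| / |e_k| for Newton's method on f(x) = x^4 - 1, with x* = 0.
ratios = []
x = 4.0
for k in range(8):
    x_next = x - (4 * x**3) / (12 * x**2)  # simplifies to (2/3) * x
    ratios.append(abs(x_next) / abs(x))    # error ratio, since x* = 0
    x = x_next
print(ratios)  # every ratio ≈ 2/3: the ratio does not shrink, i.e. linear convergence
```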

When you're close to $x^* = 0$, which is in fact the unique global minimum in this case, you're essentially solving the equation $f'(x) = 0$. Newton's method for equations can be viewed as a fixed-point iteration $x_{k+1} = g(x_k)$ where in this case $g(x) = \frac23 x$. For a fixed-point iteration to converge, you need $|g'(x^*)| < 1$, which is the case here because $|g'(0)| = \frac23$. The error at iteration $k+1$ can be written $$ e_{k+1} = x_{k+1} - x^* = g(x_k) - x^* \approx g(x^*) + g'(x^*) (x_k - x^*) - x^* = g'(x^*) e_k, $$ because $g(x^*) = x^*$ by definition of a fixed point. Now $g'(x^*) = \frac23$, so the error decreases by a factor of $\frac23$ at each iteration, and that is called linear convergence.

In the more typical case where $g'(x^*) = 0$, you would need to expand $g$ to second order: $$ e_{k+1} \approx \tfrac{1}{2} g''(x^*) e_k^2, $$ and the error is roughly squared at each iteration. That's quadratic convergence. It happens, for example, for $f(x) = x^4 + 4x$. Indeed, in that case $x^* = -1$, Newton's iteration is given by $$ x_{k+1} = \frac{2}{3} x_k - \frac{1}{3 x_k^2} = g(x_k), $$ and $g'(-1) = 0$, $g''(-1) \neq 0$, so the convergence is quadratic.
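The quadratic case can be checked numerically as well. A minimal sketch of the iteration for $f(x) = x^4 + 4x$; the starting point $x_0 = -2$ is my own choice for illustration, not from the problem:

```python
# Newton's method on f(x) = x^4 + 4x: x_{k+1} = (2/3) x_k - 1/(3 x_k^2), with x* = -1.
# Since g'(-1) = 0 and g''(-1) = -2, the error obeys e_{k+1} ≈ (1/2)|g''(-1)| e_k^2 = e_k^2.
x = -2.0  # starting point chosen for illustration
errors = []
for k in range(6):
    errors.append(abs(x - (-1.0)))       # error e_k = |x_k - x*|
    x = (2.0 / 3.0) * x - 1.0 / (3.0 * x**2)
print(errors)  # ≈ [1, 0.417, 0.111, 0.0106, 1.1e-4, 1.2e-8]: the error squares each step
```

The doubling of correct digits per step in the printed errors is exactly what distinguishes quadratic from linear convergence.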