Newton's Method in unconstrained optimization fails to converge


To show that Newton's method can produce a sequence of iterates that fails to converge, my book gives the example of applying Newton's method to minimize $f(x)={2\over 3}|x|^{3\over 2}$, starting at the initial point $x^{(0)}=1$.

The book then states
$$x^{(k)} =(-1)^k= \begin{cases} 1, & \text{if $k$ is even} \\ -1, & \text{if $k$ is odd} \end{cases}$$

But I don't understand how they have got this.

According to Newton's method, $-f(x^{(k)})=f'(x^{(k)})(x-x^{(k)})$, and
$$f'(x)= \begin{cases} x^{1/2}, & \text{if $x>0$} \\ -(-x)^{1/2}, & \text{if $x<0$,} \end{cases}$$
right?

So when $k=0$, I get $x^{(1)}$ from $-{2\over 3}=1\cdot(x^{(1)}-1)$, so $x^{(1)}={1\over 3}$.
What have I done wrong? How do I show $x^{(k)}=(-1)^k$?

Best answer:

You are applying the root-finding Newton's method $x_{n+1}=x_n-f(x_n)/f'(x_n)$. For this $f$ it reduces to $x_{n+1}=x_n/3$, so it does converge to $0$, although only linearly rather than at Newton's usual quadratic rate. Generally, this method works okay for $C^1$-smooth convex functions, such as yours.
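A quick numerical sketch of this root-finding iteration (the helper names `f` and `fprime` are my own, not from the book):

```python
import math

def f(x):
    # f(x) = (2/3)|x|^(3/2)
    return (2.0 / 3.0) * abs(x) ** 1.5

def fprime(x):
    # f'(x) = |x|^(1/2) * sign(x)
    return math.copysign(math.sqrt(abs(x)), x)

# Root-finding Newton: x_{n+1} = x_n - f(x_n)/f'(x_n), which simplifies to x_n/3
x = 1.0
for _ in range(10):
    x = x - f(x) / fprime(x)
print(x)  # close to 3**-10, i.e. the iterates shrink by a factor of ~1/3 per step
```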

But the problem was to minimize $f$. The minimizing Newton's method is $x_{n+1}=x_n-f'(x_n)/f''(x_n)$ (i.e., root-finding applied to the derivative). Here $f'(x)=|x|^{1/2}\operatorname{sign} x$ and $f''(x)=\frac12 |x|^{-1/2}$, so the iteration yields $$ x_{n+1} = x_n - \frac{|x_n|^{1/2} \operatorname{sign}x_n}{\frac12 |x_n|^{-1/2} } = x_n-2x_n = -x_n. $$ Hence the iterates oscillate between $1$ and $-1$, and the method fails to converge. Generally, this method works okay for $C^2$-smooth convex functions, but yours is not $C^2$-smooth: $f''$ blows up at $0$.
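The oscillation can be verified numerically with the same kind of sketch (helper names `fprime`, `fsecond` are mine):

```python
import math

def fprime(x):
    # f'(x) = |x|^(1/2) * sign(x)
    return math.copysign(math.sqrt(abs(x)), x)

def fsecond(x):
    # f''(x) = (1/2)|x|^(-1/2), undefined at x = 0
    return 0.5 / math.sqrt(abs(x))

# Minimizing Newton: x_{n+1} = x_n - f'(x_n)/f''(x_n) = x_n - 2*x_n = -x_n
x = 1.0
iters = [x]
for _ in range(6):
    x = x - fprime(x) / fsecond(x)
    iters.append(x)
print(iters)  # [1.0, -1.0, 1.0, -1.0, 1.0, -1.0, 1.0]
```

Every step exactly flips the sign of the iterate, matching the book's formula $x^{(k)}=(-1)^k$.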