Using Newton's method to optimize a non-convex function


I want to use Newton's method to find the local solutions of a non-convex function. I have read that the Hessian must be positive definite for Newton's method to find a solution. Can I still use Newton's method to find a local solution of a non-convex function, given that for such a function the Hessian may be negative definite or indefinite?

Best answer:

From the question, I assume that the objective is to minimize a non-convex function.

The problem with using Newton's method to minimize a non-convex function is that it does not distinguish between different types of stationary points (points of zero gradient): local maxima, local minima or saddle points. For instance, consider the non-convex 1-d function $f(x)= -x^2$ (in fact, it is concave), which has a maximum at $x=0$. One iteration of Newton's method takes you directly to the maximum.
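This behavior is easy to check numerically. The sketch below (function and variable names are my own) applies one Newton iteration $x \mapsto x - f'(x)/f''(x)$ to $f(x) = -x^2$: from any starting point it lands exactly on $x = 0$, the maximum, even though we wanted to minimize.

```python
# Newton's method on the concave function f(x) = -x**2, which has a
# maximum at x = 0. Newton's method seeks stationary points, so one
# iteration jumps straight to the maximum, not a minimum.

def newton_step(x, grad, hess):
    """One Newton iteration: x_next = x - f'(x) / f''(x)."""
    return x - grad(x) / hess(x)

grad = lambda x: -2.0 * x   # f'(x) for f(x) = -x^2
hess = lambda x: -2.0       # f''(x), constant and negative

x = 5.0                              # arbitrary starting point
x = newton_step(x, grad, hess)
print(x)  # 0.0 -- the maximum of f, found in a single step
```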

In higher dimensions, this behavior is due to the presence of directions of negative curvature of the Hessian. By this, I mean directions along eigenvectors corresponding to negative eigenvalues. For this reason, the applicability of the plain Newton's method is restricted to cases where the Hessian is positive definite.
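As a concrete illustration (the example function is my own choice), consider $f(x, y) = x^2 - y^2$, which has a saddle at the origin. Its Hessian is indefinite, and the eigenvector for the negative eigenvalue is a direction of negative curvature:

```python
import numpy as np

# Hessian of f(x, y) = x**2 - y**2 (constant, since f is quadratic).
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])

# eigh returns eigenvalues in ascending order with matching eigenvectors
eigvals, eigvecs = np.linalg.eigh(H)
print(eigvals)  # eigenvalues in ascending order: [-2.  2.]

# Columns of eigvecs whose eigenvalue is negative span the
# negative-curvature directions; here that is the y-axis.
neg_dirs = eigvecs[:, eigvals < 0]
print(neg_dirs.shape)  # (2, 1): one direction of negative curvature
```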

There exist variants of Newton's method that can minimize non-convex functions. See, for example, the Gauss-Newton algorithm or the Levenberg–Marquardt algorithm. While these algorithms were historically proposed for the non-linear least-squares problem, the underlying idea of regularizing the Hessian can be applied more generally to non-convex optimization.
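A minimal sketch of that regularization idea, in the Levenberg–Marquardt spirit: solve $(H + \lambda I)\,p = -g$ and increase $\lambda$ until $H + \lambda I$ is positive definite, which guarantees that $p$ is a descent direction. The function name, the damping schedule, and the test point below are my own illustrative choices, not a production implementation.

```python
import numpy as np

def damped_newton_step(grad, hess, lam=1e-3):
    """Levenberg-Marquardt-style step: solve (H + lam*I) p = -g,
    growing lam until H + lam*I is positive definite."""
    n = hess.shape[0]
    while True:
        try:
            # Cholesky succeeds exactly when H + lam*I is positive definite
            L = np.linalg.cholesky(hess + lam * np.eye(n))
            break
        except np.linalg.LinAlgError:
            lam *= 10.0  # simple damping schedule for illustration
    # Solve (H + lam*I) p = -g using the factor L L^T
    y = np.linalg.solve(L, -grad)
    return np.linalg.solve(L.T, y)

# At the point (1, 1) of the saddle f(x, y) = x**2 - y**2:
g = np.array([2.0, -2.0])                 # gradient there
H = np.array([[2.0, 0.0], [0.0, -2.0]])  # indefinite Hessian
p = damped_newton_step(g, H)
print(p @ g < 0)  # True: the damped step is a descent direction
```

An undamped Newton step here would use the indefinite Hessian directly and could move toward the saddle; the damping forces progress downhill instead.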