Exploit Hessian definiteness property in optimization


I would like to minimize a multivariate (most likely non-convex) function. Since I can compute both the gradient and the Hessian, I would like to apply Newton's method.

Now, in every optimization step I also check the Hessian's definiteness by performing an eigenvalue decomposition. My observation is that the eigenvalues are mostly positive, but there are always some negative ones as well.

Does that mean I will always end up at a saddle point? Is there any way to exploit my knowledge of the negative and positive eigenvalues so that I always walk downhill and never uphill?
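One standard way to exploit the eigendecomposition you are already computing is a "modified Newton" step: clamp the eigenvalues away from zero (taking absolute values, or a small floor) so the modified Hessian is positive definite, which makes the resulting step a guaranteed descent direction. A minimal sketch, assuming NumPy and a hypothetical helper name `modified_newton_step`:

```python
import numpy as np

def modified_newton_step(grad, hess, delta=1e-8):
    """Compute a descent direction by clamping the Hessian's eigenvalues
    to be at least `delta` (a modified Newton step).
    `grad` and `hess` are the gradient and Hessian at the current point."""
    # Symmetric eigendecomposition: hess = Q @ diag(eigvals) @ Q.T
    eigvals, Q = np.linalg.eigh(hess)
    # Replace negative/tiny eigenvalues so the modified Hessian is PD
    clamped = np.maximum(np.abs(eigvals), delta)
    # Solve (Q diag(clamped) Q.T) p = -grad without forming the matrix
    p = -Q @ ((Q.T @ grad) / clamped)
    return p

# Tiny example: a saddle-shaped quadratic f(x, y) = x^2 - y^2 at (1, 1)
grad = np.array([2.0, -2.0])
hess = np.array([[2.0, 0.0],
                 [0.0, -2.0]])  # indefinite Hessian
p = modified_newton_step(grad, hess)
# grad @ p < 0, i.e. p points downhill even though hess is indefinite
```

Combined with a line search, such a step always decreases the function locally, so you cannot converge to a point where the gradient is nonzero; near a strict saddle the clamped negative curvature directions actively push you away from it.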

Many thanks in advance

1 answer:
If the function is not convex, Newton's method may diverge (though not always), and far from a minimum you also lose its quadratic convergence guarantee.

As an alternative, you could use the gradient descent algorithm when the function is not convex.

A better approach is to use a quasi-Newton method (e.g. BFGS): the Hessian approximation is built up from gradient differences and is kept positive definite by construction, so every step is a descent direction.
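A minimal sketch of this approach using SciPy's BFGS implementation (assuming SciPy is available; the Rosenbrock-style test function is chosen here purely for illustration):

```python
import numpy as np
from scipy.optimize import minimize

# Non-convex test function (Rosenbrock), minimizer at (1, 1)
def f(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad_f(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

# BFGS maintains a positive-definite inverse-Hessian approximation
# from gradient differences, so no eigendecomposition is needed.
res = minimize(f, x0=np.array([-1.2, 1.0]), jac=grad_f, method="BFGS")
# res.x should be close to the minimizer (1, 1)
```

Note that quasi-Newton methods still only find local minima of a non-convex function; what they guarantee is descent at every iteration, not a global optimum.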