Constrained optimization for a non-convex function

I have a non-convex function, say $f(x,y)$. Starting from a known point $p_0=(x_0, y_0)$ with $f(x_0,y_0)<0$, I am trying to find the minimum value of $y$ subject to $f(x,y)<0$. Strictly speaking, I want the minimum value of $y$ over the set of points that are connected to $p_0$ by a continuous path along which $f(x,y)<0$. In practice I am happy to find any nearby locally minimal value of $y$ given the starting point; it is fine if we cannot guarantee that it is the minimal value of $y$ connected to the starting point by a negative path. (That would be asking too much.)
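To make the problem statement concrete, here is a minimal sketch of the formulation using a toy non-convex $f$ and an off-the-shelf SQP solver (SciPy's SLSQP). This is only an illustration of the setup, not the algorithm I am actually using; the toy $f$ and its feasible region (two disconnected lobes around $x=\pm 1$) are my own invention:

```python
import numpy as np
from scipy.optimize import minimize

# Toy non-convex f; the real f is expensive to evaluate.
# The region f < 0 splits into two lobes, around x = 1 and x = -1,
# so "connected to p0" genuinely matters here.
def f(p):
    x, y = p
    return (x**2 - 1.0)**2 + y**2 - 0.5

p0 = np.array([1.0, 0.3])   # starting point with f(p0) < 0
assert f(p0) < 0

# Minimize y subject to f(x, y) <= 0, starting from p0.
# SLSQP expects inequality constraints of the form g(p) >= 0.
res = minimize(lambda p: p[1], p0,
               constraints={"type": "ineq", "fun": lambda p: -f(p)},
               method="SLSQP")
```

For this toy example the solver stays in the lobe containing $p_0$ and ends near $(1, -\sqrt{1/2})$, the lowest point of that lobe.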

I have had some success with a constrained optimization algorithm based on BFGS, in which a quadratic model is used to search for $f(x,y)=0$ along the direction to be minimized. However, this algorithm fails when the (local) Hessian has negative eigenvalues. This is not surprising, since the BFGS quadratic model is only allowed to have positive eigenvalues. (Instead of eigenvalues becoming negative, the BFGS updates tend to 0.)

What would be a good algorithm for searching for a minimal value of $y$? (For simplicity I used a two-dimensional example above, but really I am more interested in 3- to 5-dimensional searches.)

What if I know that the Hessian is positive definite on some subspace, for example in all directions except the one that I am trying to minimize?

P.S.: The function $f(p)$ is computationally expensive to evaluate, but its derivatives at a point $p$ are cheap to compute once $f(p)$ has been computed.
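To illustrate this cost structure: since the gradient is essentially free once $f(p)$ is known, I evaluate them together in one call. SciPy's `minimize` supports this via `jac=True`, where the objective returns the pair `(value, gradient)`. The toy $f$ below is just a stand-in for the expensive function:

```python
import numpy as np
from scipy.optimize import minimize

def f_and_grad(p):
    # Toy stand-in: the real f is expensive, but its gradient is
    # nearly free once f itself has been evaluated.
    x, y = p
    val = (x**2 - 1.0)**2 + y**2 - 0.5
    grad = np.array([4.0 * x * (x**2 - 1.0), 2.0 * y])
    return val, grad

# jac=True tells the solver that the objective returns (value, gradient),
# so f and its gradient are obtained from a single evaluation.
res = minimize(f_and_grad, np.array([1.2, 0.1]), jac=True, method="BFGS")
```

For this toy function the unconstrained minimizer is $(\pm 1, 0)$ with value $-1/2$; starting at $(1.2, 0.1)$, BFGS converges to the minimum near $(1, 0)$.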