I am solving an optimization problem $$\min f(x_1,x_2)\\ \text{s.t.}~~~~ c(x_1,x_2)\leq 0$$ In my problem, $f$ has two local minima and two local maxima, and I am looking for the global minimum.
I know that, under the constraint $c(x_1,x_2)\leq 0$, there is only one point $(\hat{x}_1,\hat{x}_2)$ at which $\nabla f(x_1,x_2)=0$, and that point is my global minimum.
To solve the problem efficiently, I start from an initial point close to the solution (using an algorithm that guarantees the initial point satisfies $c<0$) and apply Newton's method to solve $\nabla f(x_1,x_2)=0$. For the initial point, I first check the following conditions, to make sure that Newton's method converges to the min (not a max) and escapes from saddle points:
$$\begin{aligned} &1)~~~~~~~c(x_1,x_2)\leq 0\\ &2)~~~u^T \nabla^2f(x_1,x_2) u>0\\ &3)~~\lambda_{min}( \nabla^2f(x_1,x_2))>0 \end{aligned}$$ If these conditions are not satisfied, I use the steepest descent algorithm to push the point into the region where all of them hold.
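To make the procedure concrete, here is a minimal sketch in Python/NumPy of the scheme described above: take a Newton step when the point is feasible and the Hessian is positive definite (condition 3, which implies condition 2 for all $u\neq 0$), and fall back to a steepest-descent step otherwise. The objective `f`, its gradient, Hessian, and the constraint are made-up placeholders, not the actual problem from the question:

```python
import numpy as np

# Hypothetical stand-in for f(x1, x2) = (x1-1)^2 + (x2-2)^2 + x1*x2/10,
# which has a single interior minimum; replace with the real problem.
def grad_f(x):
    return np.array([2.0 * (x[0] - 1.0) + x[1] / 10.0,
                     2.0 * (x[1] - 2.0) + x[0] / 10.0])

def hess_f(x):
    # Constant Hessian for this quadratic example.
    return np.array([[2.0, 0.1],
                     [0.1, 2.0]])

def constraint(x):
    # Hypothetical c(x1, x2) <= 0: stay inside a disk of radius 5.
    return x[0]**2 + x[1]**2 - 25.0

def solve(x0, step=0.1, tol=1e-10, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess_f(x)
        # Conditions 1) and 3): feasible point and smallest Hessian
        # eigenvalue positive (locally convex, so the Newton step
        # heads toward a minimum, not a maximum or saddle).
        if constraint(x) <= 0.0 and np.linalg.eigvalsh(H)[0] > 0.0:
            x = x - np.linalg.solve(H, g)   # Newton step
        else:
            x = x - step * g                # steepest-descent fallback
    return x

x_hat = solve([0.0, 0.0])
```

Since the example objective is quadratic with a positive-definite Hessian, the Newton branch converges in one step here; for a genuinely nonconvex $f$, the eigenvalue test is what keeps the iteration from being attracted to a maximum or saddle.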
Here are my questions: what is the criterion an initial point must satisfy so that Newton's method converges to the min and not the max (I mean, does not wander far away, since the initial point is close to the min)? Is Newton's method the most efficient method here? Are the conditions above correct?
I would appreciate your answers.