Construction of a nonlinear program given a solution


Suppose we are given a solution $(x^*, y^*)$ of the nonlinear program stated below; I am trying to infer some conditions on the functions involved in the problem.

The NLP I would like to solve has the following form:

$\min_{x,y} y$ s.t. $x, y$ are real numbers with $y\ge0$,

$-y\le h_1(x)\le y$

$-y\le h_2(x)\le y$

which could be reformulated as:

$\min_{x,y} y$ s.t.

$y-h_1(x) \ge 0$

$h_1(x)+y \ge 0$

$y-h_2(x) \ge 0$

$h_2(x)+y \ge 0$
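To make the reformulation concrete, here is a minimal sketch for hypothetical choices $h_1(x) = x$ and $h_2(x) = x - 1$ (these are not from the question, just an illustration): for each $x$ the smallest feasible $y$ is $\max(|h_1(x)|, |h_2(x)|)$, so minimizing $y$ reduces to a one-dimensional search over $x$.

```python
# Sketch of the reformulated problem for the hypothetical instance
# h1(x) = x, h2(x) = x - 1 (not from the original question).
# The four constraints -y <= h_i(x) <= y collapse, for fixed x,
# to y >= max(|h1(x)|, |h2(x)|).

def h1(x):
    return x          # hypothetical h1

def h2(x):
    return x - 1.0    # hypothetical h2

def smallest_feasible_y(x):
    return max(abs(h1(x)), abs(h2(x)))

# coarse grid search over x in [-1, 2]
candidates = [i / 1000.0 for i in range(-1000, 2001)]
x_star = min(candidates, key=smallest_feasible_y)
y_star = smallest_feasible_y(x_star)
print(x_star, y_star)  # 0.5 0.5, the minimax point of |x| and |x - 1|
```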

Suppose we are given a pair $(x, y)$ for which the first and the last constraints are active. I am going to express the first-order necessary conditions for optimality, following the book "Numerical Optimization" by Nocedal et al.:

The matrix whose columns are the gradients of the active constraints at the solution is given below:

$A=\begin{bmatrix}-h'_1(x^*)&h'_2(x^*)\\1&1\end{bmatrix}$

And suppose that I have chosen the pair $(x^*, y^*)$ in such a way that $A$ has full rank, so the Linear Independence Constraint Qualification (LICQ) holds.
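As a minimal sketch of this full-rank check, take the hypothetical functions $h_1(x) = x$ and $h_2(x) = x - 1$ (these are not from the question, just an illustrative instance whose minimax point is $(x^*, y^*) = (0.5, 0.5)$, with the first and last constraints active there):

```python
# LICQ check sketch for the hypothetical instance h1(x) = x, h2(x) = x - 1,
# so h1'(x) = h2'(x) = 1 everywhere. The columns of A are the gradients,
# with respect to (x, y), of the two active constraints
# y - h1(x) >= 0 and h2(x) + y >= 0.

def active_gradient_matrix(dh1, dh2):
    # A = [[-h1'(x*), h2'(x*)], [1, 1]]
    return [[-dh1, dh2], [1.0, 1.0]]

A = active_gradient_matrix(1.0, 1.0)
det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]
print(det_A != 0)  # nonzero determinant => columns independent => LICQ holds
```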

Expressing the other optimality conditions, I found values of $\lambda_1$ and $\lambda_2$ that verify all of them (primal feasibility, dual feasibility, vanishing gradient of the Lagrangian, the complementarity conditions, ...).
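For concreteness, here is a sketch of solving the stationarity condition $\nabla f = A\lambda$ for the multipliers, again using the hypothetical instance $h_1(x) = x$, $h_2(x) = x - 1$ (not from the question), where $A = \begin{bmatrix}-1&1\\1&1\end{bmatrix}$ and $\nabla f = (0, 1)$ for the objective $\min y$:

```python
# Solve A @ lam = grad_f for the two multipliers of the active constraints,
# for the hypothetical instance h1(x) = x, h2(x) = x - 1 at (0.5, 0.5).

def solve_2x2(A, b):
    # Cramer's rule for a 2x2 linear system A @ lam = b
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    lam1 = (b[0] * A[1][1] - b[1] * A[0][1]) / det
    lam2 = (A[0][0] * b[1] - A[1][0] * b[0]) / det
    return lam1, lam2

A = [[-1.0, 1.0], [1.0, 1.0]]   # columns: gradients of the active constraints
grad_f = [0.0, 1.0]             # gradient of the objective y w.r.t. (x, y)
lam1, lam4 = solve_2x2(A, grad_f)
print(lam1, lam4)  # both 0.5: nonnegative, so dual feasibility holds as well
```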

Then I came to the second-order conditions. The book says that if the first-order conditions hold and if, for every nonzero $w$ in the null space of $A$, the quantity $w^T \nabla^2_{xx} L \, w$ is strictly positive, where $L$ is the Lagrangian of the problem, then the point is a strict local minimizer. Since the null space is reduced to zero, I supposed that the second-order sufficient conditions hold.
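This vacuousness can be checked directly in the hypothetical instance $h_1(x) = x$, $h_2(x) = x - 1$ (not from the question): with gradients stored as the columns of $A$, the relevant directions satisfy $A^T w = 0$, and a 2x2 invertible $A$ leaves only $w = 0$:

```python
# Second-order check sketch: when the 2x2 active-gradient matrix A is
# invertible, the homogeneous system A^T w = 0 has only the trivial
# solution, so the set {w != 0 : A^T w = 0} is empty and the condition
# w^T (grad^2_xx L) w > 0 on that set holds vacuously.

def null_space_is_trivial(A):
    # for a 2x2 matrix, null(A^T) = {0} iff det(A) != 0
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return det != 0

A = [[-1.0, 1.0], [1.0, 1.0]]  # hypothetical instance with h1' = h2' = 1
print(null_space_is_trivial(A))  # True: second-order condition is vacuous
```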

Here I come to the problem: when I plot the objective function over a range of values of $x$, I find that the optimum is not attained at the $(x^*, y^*)$ I set at the beginning. Could anyone please help me figure out the mistake in this reasoning?

Thank you in advance. If something is unclear, please pardon my lack of explanation.