How to minimize $ x $ subject to $ y \le x^3$ and $y \ge 0 $

I have been getting into nonlinear programming (NLP), the Karush-Kuhn-Tucker (KKT) theorem, and the Linear Independence Constraint Qualification (LICQ), and I came across this problem.

My first attempt was to solve the problem graphically, and I found the optimal solution to be $ (x, y) = (0, 0) $. However, I think the KKT conditions and the LICQ fail at this point. Can anybody help me check whether the conditions hold? Thanks!

Best answer:

Let $$ \begin{align} h_1(x, y) &= x^3 - y \\ h_2(x, y) &= y \end{align} $$ so that the problem can be stated as $$ \underset{x, y}{\arg \min} \, x \\ \text{subject to } \begin{array}{c} h_1(x, y) \geq 0 \\ h_2(x, y) \geq 0 \end{array} $$

You correctly identified the optimal point as $(0, 0)$. To check LICQ, inspect the gradients of the constraints, $$ \begin{align} \nabla h_1(x, y) &= (3x^2, -1) \\ \nabla h_2(x, y) &= (0, 1) \end{align} $$ which evaluated at $(0, 0)$ yield $$ \begin{align} \nabla h_1(0, 0) &= (0, -1) \\ \nabla h_2(0, 0) &= (0, 1) \end{align} $$

Both constraints are active at $(0, 0)$, since $h_1(0, 0) = 0$ and $h_2(0, 0) = 0$, but their gradients are linearly dependent: $\nabla h_2(0, 0) = -\nabla h_1(0, 0)$. Hence LICQ does not hold at $(0, 0)$, and the KKT conditions need not be satisfied at the minimizer. Indeed, stationarity would require $\nabla f(0, 0) = (1, 0) = \mu_1 \nabla h_1(0, 0) + \mu_2 \nabla h_2(0, 0)$ for some $\mu_1, \mu_2 \geq 0$, which is impossible because both gradients have a zero first component.
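As a sanity check, the LICQ test above can be verified numerically: stack the gradients of the active constraints into a matrix and check whether it has full row rank. A minimal sketch in Python with NumPy (the helper names `grad_h1`, `grad_h2` are mine, not standard API):

```python
import numpy as np

# Constraints of the problem, written as h_i(x, y) >= 0:
#   h1(x, y) = x**3 - y   ->  grad h1 = (3x^2, -1)
#   h2(x, y) = y          ->  grad h2 = (0, 1)
def grad_h1(x, y):
    return np.array([3.0 * x**2, -1.0])

def grad_h2(x, y):
    return np.array([0.0, 1.0])

x, y = 0.0, 0.0  # candidate minimizer; both constraints are active here

# Matrix whose rows are the gradients of the active constraints.
G = np.vstack([grad_h1(x, y), grad_h2(x, y)])

# LICQ holds at (x, y) iff this matrix has full row rank (here, rank 2).
rank = np.linalg.matrix_rank(G)
print(rank)  # 1, i.e. rank-deficient, so LICQ fails at (0, 0)
```

The same check works at any other feasible point: at, say, $(1, 0)$ the gradients are $(3, -1)$ and $(0, 1)$, the matrix has rank 2, and LICQ holds there.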