Using KKT to minimize projection $f(x_1, x_2,..., x_n) = x_n$


For fixed, finite $n$, let $x \in \mathbb{R}^n$. I am trying to solve a problem of the form:

Minimize: $$f(x) = x_n$$ such that $$g_j(x) \leq 0; \ j \in \{1,...,m\}$$ $$h_\ell(x)=0; \ \ell \in \{1,..., r\}$$

What I tried was to use the KKT conditions, but I have a problem: a minimum is a point where $\nabla_x f = 0$, and, since I am working with the projection $f(x) = x_n$, the gradient is $\nabla_x f = (0,0,...,1)^T$, which never vanishes. Does that mean I can't use this method? If so, what can I try?

If you are trying to compute a global minimum, you need a convex cost function and a convex feasible domain (with nonempty interior); thus each $g_j$ must be convex and the $h_\ell$ must satisfy certain assumptions (affine is sufficient). In such a case, the KKT conditions are expressed as:

1- $\nabla_x \mathcal{L}(x, y) = 0$ (stationarity), where $y$ denotes the dual variables

2- $g_j(x) \leq 0$, $h_\ell(x) = 0$ (primal feasibility)

3- $y \geq 0$ (dual feasibility)

4- $y_j \, g_j(x) = 0$ for every $j$ (complementary slackness)

=> The Lagrangian combines the cost function $f$ with the constraints ($g(x) \leq 0$ and $h(x) = 0$): $\mathcal{L}(x, y, \mu) = f(x) + \sum_{j} y_j \, g_j(x) + \sum_{\ell} \mu_\ell \, h_\ell(x)$.

=> Complementary slackness is the reformulation of what @Cesareo said in his comment: if the minimizer lies in the interior of the feasible domain, the dual variables are zero. If it lies on the boundary, they can take nonzero values, so the gradient of $f$ alone is not necessarily zero there. So the fact that $\nabla_x f \neq 0$ is not a problem: what must vanish at a KKT point is the gradient of the Lagrangian, not the gradient of $f$.
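As a concrete illustration (a hypothetical instance, not the asker's actual problem): minimize $f(x) = x_2$ over the unit disk $g(x) = x_1^2 + x_2^2 - 1 \leq 0$. The analytic solution is $x^* = (0, -1)$ with multiplier $y^* = 1/2$, found from stationarity of the Lagrangian $\mathcal{L}(x, y) = x_2 + y\,(x_1^2 + x_2^2 - 1)$, even though $\nabla f = (0, 1)^T$ never vanishes. A sketch using `scipy.optimize.minimize`:

```python
import numpy as np
from scipy.optimize import minimize

# Minimize f(x) = x_2 subject to x_1^2 + x_2^2 <= 1.
f = lambda x: x[1]

# scipy's convention for inequality constraints is c(x) >= 0,
# so g(x) <= 0 is passed as -g(x) >= 0.
cons = {"type": "ineq", "fun": lambda x: 1.0 - x[0]**2 - x[1]**2}

res = minimize(f, x0=[0.5, 0.5], constraints=cons, method="SLSQP")
x = res.x
print(x)  # close to [0, -1]: the constraint is active at the solution

# Check KKT stationarity at x*: grad f + y * grad g = 0 with y = 1/2,
# i.e. (0, 1) + 0.5 * (2*x_1, 2*x_2) = 0 at (0, -1).
y = 0.5
grad_L = np.array([0.0, 1.0]) + y * np.array([2 * x[0], 2 * x[1]])
print(np.allclose(grad_L, 0.0, atol=1e-4))
```

Note that the multiplier is nonzero precisely because the solution sits on the boundary, exactly as the complementary slackness condition predicts.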