normalization of constraints $ 0 \leq x \leq 1 $ in Lagrangian KKT


With the Lagrangian method we have an objective function and a set of equality constraints of the form $ g_{i}(x_{j}) = 0 $. With KKT we can also have a set of inequality constraints of the form $ h_{i}(x_{j}) \leq 0 $. All constraints have to be normalized to fit this form. I don't know if there is a better term than "normalization"; is there one?

What happens in cases where the variables satisfy $ 0 \leq x_{j} \leq 1$ and $x_{j} \in \mathbb{Z}$ for all $j$, which is very common in optimization problems? I suppose I need to transform this to the form $ h_{i}(x_{j}) \leq 0 $, so I broke it into two constraints and added them to the set of constraints:

$$ -x_{j} \leq 0 $$ $$ x_{j} -1 \leq 0 $$

However, when solving this with KKT I need to introduce multipliers $ \mu_{j} $ and form a Lagrangian containing the terms below. Am I right so far?

$$ \mu_{1}(-x_{j}) $$ $$ \mu_{2}(x_{j} -1) $$

The problem occurs when I compute the partial derivatives with respect to $\mu_{1}$ and $\mu_{2}$. I get two equations:

$$ x_{j} = 0 $$ $$ x_{j} -1 = 0 $$

Now these two equations contradict each other, leaving the entire system of equations obtained from the partial derivatives inconsistent.

So how can $0 \leq x_{j} \leq 1$ be fitted into a problem that I intend to solve with KKT?

Is $x^{2} - x = 0$ a better constraint to describe the same thing for KKT? But then I also get $x = 0.5$ after taking the partial derivative.


BEST ANSWER

The partial derivatives in the KKT system are with respect to the primal variables (i.e. $x$) and not to the dual variables.

So if $\lambda_j$ and $\mu_j$ are the dual variables (or multipliers) respectively corresponding to the constraints $-x_j \le 0$ and $x_j - 1 \le 0$, you will have $-\lambda_j x_j + \mu_j (x_j -1)$ in the Lagrangian, and thus the derivative of the Lagrangian wrt $x_j$ will contain $-\lambda_j + \mu_j$.
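To make this concrete, here is a small symbolic sketch of solving the KKT system for an illustrative objective $f(x) = (x-2)^2$ with $0 \le x \le 1$ (the objective and all variable names are assumptions, not from the question). Note that stationarity is imposed on the derivative with respect to $x$, while complementary slackness, not a derivative, handles the multipliers:

```python
# Illustrative sketch: KKT system for min (x-2)^2 s.t. 0 <= x <= 1,
# using the multiplier convention above: the Lagrangian contains
# -lam*x + mu*(x - 1).  The objective f is an assumed example.
import sympy as sp

x, lam, mu = sp.symbols('x lam mu', real=True)
f = (x - 2)**2                           # assumed example objective
L = f + lam*(-x) + mu*(x - 1)            # Lagrangian with both box multipliers

stationarity = sp.Eq(sp.diff(L, x), 0)   # dL/dx = 0 (NOT dL/dmu = 0)
comp_lower = sp.Eq(lam*(-x), 0)          # complementary slackness, lower bound
comp_upper = sp.Eq(mu*(x - 1), 0)        # complementary slackness, upper bound

candidates = sp.solve([stationarity, comp_lower, comp_upper],
                      [x, lam, mu], dict=True)
# keep only candidates satisfying primal feasibility and dual sign conditions
kkt_points = [s for s in candidates
              if 0 <= s[x] <= 1 and s[lam] >= 0 and s[mu] >= 0]
print(kkt_points)   # the only valid KKT point: x = 1, lam = 0, mu = 2
```

The infeasible stationary point $x = 2$ and the sign-violating candidate at $x = 0$ are discarded by the filter, so no contradiction arises.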

An alternative way to express $0\le x_j \le 1$ is $x_j(x_j-1) \le 0$.
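A quick sanity check of this equivalence on a few sample points (just an illustrative sketch):

```python
# x*(x - 1) <= 0 holds exactly when 0 <= x <= 1; spot-check a few points.
pts = [-0.5, 0.0, 0.25, 1.0, 1.5]
for x in pts:
    assert (x*(x - 1) <= 0) == (0 <= x <= 1)
print("equivalent on all sample points")
```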

ANOTHER ANSWER

You only have equality for equality-constraints. The inequality constraints should be satisfied by, well, inequality.

Consider the problem
$$\min\limits_{x\in \mathbb{R}^n} f(x)$$
subject to
\begin{align} c_i(x) &= 0, & &i\in\mathcal{E}\\ c_i(x) &\geq 0, & &i\in\mathcal{I}. \end{align}
Define the Lagrangian
$$\mathcal{L}(x,\lambda) = f(x) - \sum\limits_{i\in\mathcal{E}\cup\mathcal{I}}\lambda_i c_i(x).$$
Then necessary conditions (the famous KKT conditions) for a local minimizer $x^*$ are
\begin{align} \nabla_x\mathcal{L}(x^*,\lambda^*) &= 0,\\ c_i(x^*) &= 0, & &\forall i\in\mathcal{E}\\ c_i(x^*) &\geq 0, & &\forall i\in\mathcal{I}\\ \lambda_i^* &\geq 0, & &\forall i \in\mathcal{I}\\ \lambda_i^*c_i(x^*) &= 0, & &\forall i\in\mathcal{E}\cup\mathcal{I}. \end{align}
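As a hedged sketch of how these conditions apply to the box constraints in the question, write them in this answer's $c_i(x) \ge 0$ convention as $c_1(x) = x$ and $c_2(x) = 1 - x$; the objective $f(x) = (x-2)^2$ below is an illustrative assumption, not part of the question:

```python
# Numeric check of the KKT conditions for 0 <= x <= 1 written as
# c1(x) = x >= 0 and c2(x) = 1 - x >= 0, with assumed f(x) = (x - 2)^2.
def grad_f(x):
    return 2.0 * (x - 2.0)

def check_kkt(x, lam1, lam2, tol=1e-9):
    c1, c2 = x, 1.0 - x
    # grad L = grad f - lam1*grad(c1) - lam2*grad(c2), with grad(c1)=1, grad(c2)=-1
    stationarity = abs(grad_f(x) - lam1*1.0 - lam2*(-1.0)) < tol
    feasible = c1 >= -tol and c2 >= -tol
    dual_feasible = lam1 >= -tol and lam2 >= -tol
    complementary = abs(lam1*c1) < tol and abs(lam2*c2) < tol
    return stationarity and feasible and dual_feasible and complementary

# x* = 1: lower bound inactive (lam1 = 0), upper bound active (lam2 = 2)
print(check_kkt(1.0, 0.0, 2.0))   # True
print(check_kkt(0.5, 0.0, 0.0))   # False: grad f(0.5) = -3, stationarity fails
```

Complementary slackness, rather than differentiating with respect to the multipliers, is what forces $\lambda_i^* = 0$ on the inactive bound, so the "contradiction" from the question never appears.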

Can you see how these would be applied to your problem?