With Lagrange multipliers we have an objective function and a set of equality constraints of the form $ g_{i}(x_{j}) = 0 $. With KKT we can also have a set of inequality constraints of the form $ h_{i}(x_{j}) \leq 0 $. All constraints have to be normalized to fit into KKT. I don't know whether there is a better term that I should use instead of "normalization". Is there one?
What happens in cases where the variables satisfy bounds like $ 0 \leq x_{j} \leq 1$ and $x_{j} \in \mathbb{Z}$ $\forall\; j$, which is very common in optimization problems? I suppose I need to transform this to the form $ h_{i}(x_{j}) \leq 0 $, so I break it into the two constraints below and add them to the set of constraints:
$$ -x_{j} \leq 0 $$ $$ x_{j} -1 \leq 0 $$
However, when solving the same problem with KKT I need to introduce multipliers $ \mu_{j} $ and form a Lagrangian containing the terms below. Am I right this far?
$$ \mu_{1}(-x_{j}) $$ $$ \mu_{2}(x_{j} -1) $$
The problem occurs when I compute the partial derivatives w.r.t. $\mu_{1}$ and $\mu_{2}$. I get two equations:
$$ x_{j} = 0 $$ $$ x_{j} -1 = 0 $$
Now these two equations contradict each other, leaving the entire system of equations computed through the partial derivatives inconsistent.
So how does $0 \leq x_{j} \leq 1$ fit into a problem that I intend to solve with KKT?
Is $x^{2} - x = 0$ a better constraint to describe the same thing for KKT? But then I get $x = 0.5$ after taking the partial derivative.
The partial derivatives in the KKT system are with respect to the primal variables (i.e. $x$) and not to the dual variables.
So if $\lambda_j$ and $\mu_j$ are the dual variables (or multipliers) respectively corresponding to the constraints $-x_j \le 0$ and $x_j - 1 \le 0$, you will have $-\lambda_j x_j + \mu_j (x_j -1)$ in the Lagrangian, and thus the derivative of the Lagrangian wrt $x_j$ will contain $-\lambda_j + \mu_j$.
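To make this concrete, here is a sketch in sympy on a hypothetical one-variable problem of my own choosing (minimizing $(x-2)^2$ over $0 \le x \le 1$; the objective is not from the question). It forms the Lagrangian with the two multipliers above and solves the KKT system: stationarity in $x$, plus complementary slackness for each inequality, rather than differentiating with respect to the multipliers:

```python
import sympy as sp

# Hypothetical example: minimize f(x) = (x - 2)^2 subject to 0 <= x <= 1.
x, lam, mu = sp.symbols('x lam mu', real=True)
f = (x - 2)**2

# Lagrangian: lam is the multiplier for -x <= 0, mu for x - 1 <= 0
L = f + lam * (-x) + mu * (x - 1)

# Stationarity is w.r.t. the primal variable x, NOT the multipliers;
# it contains the term -lam + mu mentioned above.
stationarity = sp.diff(L, x)              # 2*(x - 2) - lam + mu

# Complementary slackness replaces "derivative w.r.t. the multipliers":
# each multiplier times its constraint must vanish.
comp_slack = [lam * (-x), mu * (x - 1)]

solutions = sp.solve([stationarity] + comp_slack, [x, lam, mu], dict=True)

# Keep only feasible x with nonnegative multipliers (dual feasibility).
kkt_points = [s for s in solutions
              if 0 <= s[x] <= 1 and s[lam] >= 0 and s[mu] >= 0]
print(kkt_points)
```

Note how the contradiction from the question disappears: $\lambda x = 0$ and $\mu(x-1) = 0$ hold simultaneously because one multiplier is zero whenever its constraint is inactive. Here the surviving KKT point is $x = 1$ with $\lambda = 0$, $\mu = 2$.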
An alternative way to express $0\le x_j \le 1$ is $x_j(x_j-1) \le 0$.
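For the same hypothetical objective $(x-2)^2$ as in the sketch above (again my own choice, not from the question), the single-constraint formulation with one multiplier $\nu$ recovers the same point:

```python
import sympy as sp

# Same hypothetical problem, minimize (x - 2)^2, but now with the single
# constraint x*(x - 1) <= 0 and one multiplier nu.
x, nu = sp.symbols('x nu', real=True)
L = (x - 2)**2 + nu * x * (x - 1)

stationarity = sp.diff(L, x)          # 2*(x - 2) + nu*(2*x - 1)
comp_slack = nu * x * (x - 1)         # complementary slackness

solutions = sp.solve([stationarity, comp_slack], [x, nu], dict=True)
kkt_points = [s for s in solutions if 0 <= s[x] <= 1 and s[nu] >= 0]
print(kkt_points)                     # same minimizer x = 1 as before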