Hi there, I am reviewing for a final on convex optimization and came across a problem I am not sure about.
Consider a convex optimization problem: $$\min f_0(x)$$ $$\text{subject to }f_i(x)\le0, i=1,...,m, Ax=b,$$ that satisfies Slater's constraint qualification.
Suppose $x^*$ is optimal, with $f_1(x^*)=-0.2$. Is it true then that for every dual optimal point $(\lambda^*,\nu^*)$, we have $\lambda_1^*=0$?
This question is from the Stanford open course: http://web.stanford.edu/class/ee364a/quizzes/duality.html
I don't understand why it is true. Is it because we have $f_1(x^*)=-0.2$ instead of $f_1(x^*)\le-0.2$? How do we know the constraint does not constrain the minimum?
The Lagrange multipliers associated with inequality constraints of the form $f_i(x) \leq 0$ satisfy complementary slackness. Since Slater's condition holds, strong duality holds, and every primal-dual optimal pair satisfies
$$ \lambda_i^{\star} f_i(x^{\star}) = 0, \quad \forall i = 1, \dots, m. $$
To see why, chain the inequalities that strong duality gives you:
$$ f_0(x^{\star}) = g(\lambda^{\star}, \nu^{\star}) = \inf_x L(x, \lambda^{\star}, \nu^{\star}) \le L(x^{\star}, \lambda^{\star}, \nu^{\star}) = f_0(x^{\star}) + \sum_{i=1}^m \lambda_i^{\star} f_i(x^{\star}) \le f_0(x^{\star}), $$
where the equality constraint term $\nu^{\star T}(Ax^{\star}-b)$ vanishes because $Ax^{\star}=b$, and the last inequality uses $\lambda_i^{\star} \ge 0$ and $f_i(x^{\star}) \le 0$. Every inequality in the chain must therefore be tight, so $\sum_i \lambda_i^{\star} f_i(x^{\star}) = 0$; since each term is nonpositive, each term is zero.

In your case $f_1(x^{\star}) = -0.2 \neq 0$, so the only way for the product $\lambda_1^{\star} f_1(x^{\star})$ to equal $0$ is $\lambda_1^{\star} = 0$. If instead you knew that $f_1(x^{\star}) = 0$, only then could $\lambda_1^{\star}$ be nonzero - and whether it actually is would depend on the rest of the KKT conditions.
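A concrete toy instance may help. The problem below (my own hypothetical example, not from the quiz) is a one-dimensional convex program where the inequality constraint is slack at the optimum, so complementary slackness forces its multiplier to zero, exactly as in your question:

```python
# Hypothetical instance: minimize f0(x) = (x - 2)**2 subject to f1(x) = x - 5 <= 0.
# The unconstrained minimizer x* = 2 is feasible, with slack f1(x*) = -3 < 0,
# so complementary slackness forces lambda1* = 0 (analogue of f_1(x*) = -0.2 above).

def f1(x):
    return x - 5.0  # inequality constraint f1(x) <= 0

x_star = 2.0        # optimal point: unconstrained minimum of (x - 2)^2, and feasible
slack = f1(x_star)  # -3.0, strictly negative => constraint inactive at x*

# Stationarity of the Lagrangian L(x, lam) = (x - 2)^2 + lam * (x - 5):
# dL/dx = 2*(x - 2) + lam = 0 at x = x_star  =>  lam = -2*(x_star - 2) = 0.
lam_star = -2.0 * (x_star - 2.0)

print(slack)                  # -3.0
print(lam_star)               # 0.0
print(lam_star * f1(x_star))  # 0.0 -> complementary slackness holds
```

If you instead moved the constraint to $x \le 1$, the unconstrained minimum would become infeasible, the constraint would be active ($f_1(x^{\star}) = 0$), and the multiplier would come out nonzero from stationarity.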