So I was trying to work through a very basic convex optimization example using the method of Lagrange multipliers. I wanted to solve: $$ \min f_{0}(x)=x^2 $$ $$ \text{subject to } f_{1}(x) = x - 2 \leq 0 $$ So I wrote the Lagrangian... $$ L(x,\lambda) = f_{0}(x)-\lambda f_{1}(x)=x^2-\lambda(x-2)=x^2-\lambda x +2\lambda $$ And then set the gradient to zero... $$ \nabla_{x,\lambda}L(x,\lambda)=\Big(\frac{\partial L}{\partial x},\frac{\partial L}{\partial \lambda}\Big) = (0,0) $$ ...to get the two equations... $$ 2x-\lambda=0 \quad\text{and}\quad -x+2=0 $$ Which only seems to indicate that the solution is $$ x=2 $$
But this is obviously wrong: the minimum of $x^2$ over $x \le 2$ is clearly at $x=0$. So did I make an egregious mistake, or does the method only work in more than one dimension, or ...?
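To make it concrete, here is a quick brute-force check (plain Python, just evaluating the objective on a grid of feasible points) confirming that the minimum should be at $x=0$, not $x=2$:

```python
# Minimize f0(x) = x^2 subject to x - 2 <= 0 by brute force:
# evaluate f0 on a fine grid of feasible points in [-4, 2].
xs = [-4 + i * 1e-3 for i in range(6001)]  # grid covering [-4, 2]
best = min(xs, key=lambda x: x * x)
print(best)  # approximately 0, not 2
```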
We do have $$2x-\lambda = 0,$$ but we cannot assume $-x+2=0$ here, because the inequality constraint need not be active at the optimum.
The right conditions are the sign constraint on the multiplier, $\lambda \ge 0$, together with complementary slackness:
$$\lambda(-x+2)=0.$$
If $\lambda=0$, then $x=0$ (the constraint is not active).
If $\lambda \ne 0$, then $x=2$ (the constraint is active).
Comparing the two candidates, $f_0(0)=0 < f_0(2)=4$, so the minimizer is $x=0$, where the constraint is inactive.
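The case analysis above can be checked directly: enumerate the two candidates produced by complementary slackness, keep the feasible ones, and compare objective values (a small illustrative script, not part of the KKT machinery itself):

```python
def f0(x):
    return x ** 2  # objective

def f1(x):
    return x - 2  # constraint function, require f1(x) <= 0

# Complementary slackness lambda*(x - 2) = 0 gives two candidates:
#   lambda = 0  -> stationarity 2x - lambda = 0 forces x = 0 (inactive)
#   x = 2       -> constraint active
candidates = [0.0, 2.0]
feasible = [x for x in candidates if f1(x) <= 0]
x_star = min(feasible, key=f0)
print(x_star, f0(x_star))  # 0.0 0.0
```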