Let (1) $f:\mathbb{R}^k \rightarrow \mathbb{R}$ be a convex and differentiable function and (2) $g:\mathbb{R}^k \rightarrow \mathbb{R}$ be a concave and differentiable function. Consider the minimization problem $\{\min _x \ \ f(x) \ \ s.t. \ \ g(x)=0\}$ and let $x^*$ be a point that satisfies the Lagrange conditions for some multiplier $\lambda^*$: $\nabla f(x^*) = \lambda ^* \nabla g(x^*)$.
My question is: is the convexity of $f$ together with the concavity of $g$ enough to conclude that $x^*$ is a local or global minimizer? Is there a theorem that I can use in this case?
Thanks.
It is neither sufficient for a local nor for a global minimum. Consider $f(x,y)=[x-(1/2)]^2+(y-2)^2$ and $g(x,y)=y-x^2/2$. The first-order optimality conditions (written here as $\nabla f + \lambda \nabla g = 0$; since $\lambda$ is unrestricted in sign, this is equivalent to the form in the question) are $$2x-1-\lambda x=0,\\2y-4+\lambda=0,\\y-x^2/2=0.$$ The latter two equations yield $\lambda=4-x^2$. Substitution into the first equation yields $$x^3-2x-1=0,$$ which factors as $(x+1)(x^2-x-1)=0$ and hence has the solutions $x_1=-1$, $x_{2,3}=(1\pm\sqrt{5})/2$. Let us denote the corresponding $y$ values by $y_1$, $y_2$, and $y_3$, respectively. The point $(x_2,y_2)$ with $x_2=(1+\sqrt{5})/2$ is the global minimum, but $(x_3,y_3)$ with $x_3=(1-\sqrt{5})/2$ is neither a global nor a local minimum; it is in fact a local maximum of $f$ along the constraint.
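One can verify this numerically. Substituting the constraint $y=x^2/2$ into $f$ gives the one-variable function $h(x)=(x-1/2)^2+(x^2/2-2)^2$, whose derivative simplifies exactly to the cubic $x^3-2x-1$ above. A small sketch in Python (the helper names `h`, `dh`, `d2h` are my own) evaluates $h$ and its derivatives at the three stationary points:

```python
import math

def h(x):
    # f restricted to the constraint curve y = x**2 / 2
    return (x - 0.5)**2 + (x**2 / 2 - 2)**2

def dh(x):
    # h'(x) simplifies to the cubic x^3 - 2x - 1
    return x**3 - 2*x - 1

def d2h(x):
    # h''(x) = 3x^2 - 2, used to classify the stationary points
    return 3*x**2 - 2

x1 = -1.0
x2 = (1 + math.sqrt(5)) / 2   # approx +1.618
x3 = (1 - math.sqrt(5)) / 2   # approx -0.618

for x in (x1, x2, x3):
    print(f"x = {x:+.6f}  h = {h(x):.6f}  h' = {dh(x):+.1e}  h'' = {d2h(x):+.3f}")
```

The output shows $h''>0$ at $x_1$ and $x_2$ (both local minima, with $h(x_2)\approx 1.727 < h(x_1)=4.5$, so $x_2$ is the global minimizer) and $h''<0$ at $x_3$ (a local maximum along the constraint), even though all three points satisfy the Lagrange conditions.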