Constrained optimization with constraint qualification conditions


Let $X$ be a normed linear space, and let $G$ be a Fréchet differentiable mapping defined on $X$. A point $x_0$ is said to satisfy the constraint qualification relative to the inequality $G(x)\leq\theta$ if $G(x_0)\leq\theta$ and if, for every $h\in X$ satisfying $G(x_0)+G'(x_0)h\leq\theta$, there is a differentiable arc $x(t)$ defined for $t\in[0,1]$ such that (i) $G(x(t))\leq\theta$ for all $t\in[0,1]$, (ii) $\frac{dx(t)}{dt}\big|_{t=0}=h$, and (iii) $x(0)=x_0$. Suppose $x_0$ minimizes $f$ subject to the constraint $G(x)\leq\theta$ and that $x_0$ satisfies the constraint qualification. How does one show that $f'(x_0)h\geq0$ for every $h$ satisfying $G(x_0)+G'(x_0)h\leq\theta$, and how can this result be used to prove a Lagrange multiplier theorem for finite-dimensional spaces?
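A standard chain-rule argument settles the first part, and Farkas' lemma then gives the multiplier theorem; a sketch using only the definitions above:

```latex
% Part 1: f'(x_0)h >= 0 on the linearized cone.
Fix $h$ with $G(x_0)+G'(x_0)h\leq\theta$ and let $x(t)$ be the arc given
by the constraint qualification. Every $x(t)$ is feasible, and $x_0$
minimizes $f$ over the feasible set, so
\[
  \varphi(t) := f(x(t)) \geq f(x_0) = \varphi(0), \qquad t\in[0,1].
\]
Thus $\varphi$ has a one-sided minimum at $t=0$, and by the chain rule
\[
  0 \leq \varphi'(0^+) = f'(x_0)\,x'(0) = f'(x_0)h .
\]

% Part 2: Lagrange multipliers in finite dimensions.
Let $X=\mathbb{R}^n$, $G=(g_1,\dots,g_m)$, and let
$A=\{\,i: g_i(x_0)=0\,\}$ be the active index set. If
$\nabla g_i(x_0)^{\top}h\le 0$ for all $i\in A$, then for all small
$t>0$ the vector $th$ satisfies $G(x_0)+G'(x_0)(th)\le 0$ (the
inactive constraints have slack), so $f'(x_0)(th)\ge 0$, i.e.\
$\nabla f(x_0)^{\top}h\ge 0$. By Farkas' lemma there exist
$\lambda_i\ge 0$, $i\in A$, with
\[
  \nabla f(x_0) + \sum_{i\in A}\lambda_i \nabla g_i(x_0) = 0 .
\]
Setting $\lambda_i=0$ for $i\notin A$ gives $\lambda\ge 0$ and
$\lambda^{\top}G(x_0)=0$: the KKT conditions hold at $x_0$.
```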


Major edit due to an error

Your condition actually is a constraint qualification, and it is exactly the Kuhn–Tucker constraint qualification. When I first saw this question, for some reason I did not see how to obtain the polar cones, but that is not needed: under your condition the tangent cone and the linearized feasible direction set coincide, which is exactly Abadie's constraint qualification (ACQ). Hence it is a constraint qualification. This is how one shows it is a constraint qualification in finite dimensions; in infinite dimensions the argument is similar.
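To see why the equality of the two cones depends on the algebraic description and not just on the set, here is a small numerical sketch (the functions `g1`, `g2` are my own illustrative choices, not from the question). The feasible set $\{t\le 0\}\subset\mathbb{R}$ has tangent cone $(-\infty,0]$ at the origin, and it can be described either by $g_1(t)=t$ or by $g_2(t)=\max(0,t)^2$:

```python
# Abadie's CQ asks whether the linearized feasible direction set
#   {h : g(t0) + g'(t0) h <= 0}
# equals the tangent cone of the feasible set at t0.  The feasible set
# {t <= 0} has tangent cone (-inf, 0] at t0 = 0, but the linearized set
# depends on which function g describes the set.
# (Illustrative sketch; g1 and g2 are my own example functions.)

def linearized_directions(g, dg, t0, hs):
    """Return the trial directions h with g(t0) + g'(t0)*h <= 0."""
    return [h for h in hs if g(t0) + dg(t0) * h <= 0]

g1  = lambda t: t                      # description 1: g1(t) = t
dg1 = lambda t: 1.0                    # g1'(t) = 1
g2  = lambda t: max(0.0, t) ** 2       # description 2: g2(t) = max(0,t)^2
dg2 = lambda t: 2.0 * max(0.0, t)      # g2'(t); note g2'(0) = 0

hs = [-2.0, -1.0, 0.0, 1.0, 2.0]
print(linearized_directions(g1, dg1, 0.0, hs))  # only h <= 0: matches the tangent cone
print(linearized_directions(g2, dg2, 0.0, hs))  # all h: strictly larger, ACQ fails
```

For `g2` the linearized cone is all of $\mathbb{R}$, strictly larger than the tangent cone, so ACQ fails for that description; consistently, no feasible arc with $x'(0)=1$ exists, so the question's condition fails there too.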


Old and wrong answer:

I don't entirely understand your question. What is "the constraint qualification"? Do you have a reference, or did you invent it? If you invented it, I would suggest changing the name of the condition, because it is not even a constraint qualification: your condition is too weak. It's not hard to see this: consider the function $g:\mathbb{R}\rightarrow\mathbb{R}$ with $g(t) = \max(0,t)^2$. The feasible set $\{t \in \mathbb{R} : g(t)\leq 0\}$ satisfies your conditions, but the origin does not satisfy the KKT conditions unless the gradient of the objective function is zero. In general, constraint qualifications depend on the algebraic description of the feasible set and on the cone generated by the active constraints. By the way, your conditions look like some kind of Clarke regularity, but I am not sure.
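The failure of KKT in this example can be checked directly; here is a minimal sketch with $f(t)=t$ as the objective (my choice of $f$, not from the original text):

```python
# KKT stationarity at t0 = 0 for  min f(t) = t  s.t.  g(t) = max(0,t)^2 <= 0
# would require a multiplier lam >= 0 with  f'(0) + lam * g'(0) = 0.
# Since g'(0) = 0 and f'(0) = 1, the equation reads 1 = 0: no multiplier
# exists, even though t0 = 0 minimizes f over the feasible set (-inf, 0].

def kkt_stationarity_residual(df0, dg0, lam):
    """Residual of the stationarity equation f'(0) + lam * g'(0) = 0."""
    return df0 + lam * dg0

df0 = 1.0                   # f(t) = t            -> f'(0) = 1
dg0 = 2.0 * max(0.0, 0.0)   # g(t) = max(0,t)^2   -> g'(0) = 0

# The residual is 1 for every lam >= 0, so stationarity never holds.
residuals = [kkt_stationarity_residual(df0, dg0, lam) for lam in (0.0, 1.0, 100.0)]
print(residuals)
```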

For your condition to be a constraint qualification, it must be strengthened to:

Fix the feasible set $\{ x : G(x) \leq\theta \}$. For every objective function $f$: if $G(x_0)\leq\theta$ and, for every $h\in X$ satisfying $G(x_0)+G'(x_0)h\leq\theta$, there is a differentiable arc $x(t)$ defined for $t\in[0,1]$ such that (i) $G(x(t))\leq\theta$ for all $t\in[0,1]$, (ii) $\frac{dx(t)}{dt}\big|_{t=0}=h$, and (iii) $x(0)=x_0$, then the KKT conditions hold at $x_0$.

Given the current state of optimization theory, I would expect the reformulated conditions to be equivalent to some existing ones. I would say that the reformulated conditions are the Kuhn–Tucker constraint qualification (please do not confuse this with the KKT conditions themselves; it is actually a constraint qualification).