In convex optimization, I would like to show that the necessary conditions of the multiplier rule correspond to the nonexistence of a descent direction. I would like to prove the following theorem:
Theorem: Let $\{\zeta_i : i = 1,2,\dots,k \}$ be a finite subset of $X^{*}$, the dual space of the normed space $X$. The following are equivalent:
$(a)$ There is no $v \in X$ such that $\langle \zeta_i , v \rangle < 0 , \forall i = 1,...,k$;
$(b)$ The set $\{\zeta_i : i = 1,2,\dots,k \}$ is positively linearly dependent: there exists a nonzero nonnegative vector $\gamma \in \mathbb{R}^k$ such that $\sum_{i=1}^{k} \gamma_i \zeta_i = 0$.
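To make the alternative concrete before the proof, here is a small numerical illustration in $\mathbb{R}^2$ (a toy example of my own, not part of the argument): with $\zeta_1 = (1,0)$ and $\zeta_2 = (-1,0)$, the two pairings with any $v$ sum to zero, so they can never both be negative, and $\gamma = (1,1)$ witnesses $(b)$.

```python
import random

# Toy instance of the alternative: zeta_1 = (1,0), zeta_2 = (-1,0).
zeta = [(1.0, 0.0), (-1.0, 0.0)]

def pairings(v):
    # The vector of duality pairings (<zeta_1, v>, ..., <zeta_k, v>).
    return [z[0] * v[0] + z[1] * v[1] for z in zeta]

# (b): gamma = (1,1) is nonzero, nonnegative, and sum_i gamma_i * zeta_i = 0.
gamma = (1.0, 1.0)
combo = tuple(sum(g * z[j] for g, z in zip(gamma, zeta)) for j in range(2))
print(combo)  # (0.0, 0.0)

# (a): randomly sampled v never makes both pairings strictly negative,
# since <zeta_1, v> + <zeta_2, v> = 0 for every v.
for _ in range(1000):
    v = (random.uniform(-1, 1), random.uniform(-1, 1))
    p = pairings(v)
    assert not (p[0] < 0 and p[1] < 0)
```

Of course this only checks one instance; the theorem asserts the equivalence in general.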
$\quad$ To prove $(a) \implies (b)$, I will construct two subsets and try to use the separation theorem. First, I state that theorem:
(Hahn-Banach separation theorem)
Let $K_1$ and $K_2$ be nonempty, disjoint convex subsets of the normed space $X$. If $K_1$ is open, then the sets can be separated. That is, there exist $\gamma \in X^{*}$ and $\eta \in \mathbb{R}$ such that $$\langle \gamma , x \rangle < \eta \leq \langle \gamma , y \rangle, \quad \forall x \in K_1 , y \in K_2.$$
To apply the separation theorem, let
$$K_1 = \{y \in \mathbb{R}^k : y_i < 0, \ \forall i\}, \quad K_2 = \{ (\langle \zeta_1 , v \rangle , \dots, \langle \zeta_k , v \rangle) : v \in X \}.$$
The sets $K_1 , K_2$ are both convex and nonempty, and $K_1$ is open. By $(a)$ the sets are disjoint. Then, by the separation theorem (applied in $\mathbb{R}^k$), there exist $\eta \in \mathbb{R}$ and a nonzero $\gamma \in \mathbb{R}^k$ with $\langle \gamma , x \rangle < \eta \leq \langle \gamma , y \rangle$ for all $x \in K_1$, $y \in K_2$. It is straightforward to show that $\eta = 0$ and
$$ 0 \leq \sum \gamma_i \langle \zeta_i , v \rangle, \quad \forall v \in X.$$
I can't show the nonnegativity of $\gamma$ or the other inequality
$$ 0 \geq \sum \gamma_i \langle \zeta_i , v \rangle, \quad \forall v \in X.$$
Any help would be appreciated.
Hints:
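A possible sketch of the two missing steps, using only the separation data above (my own attempt, so treat it as a sketch rather than an authoritative proof):

```latex
\begin{itemize}
  \item \emph{Nonnegativity of $\gamma$.} $K_1$ is a cone, so the strict
        inequality constrains the signs componentwise: fix $j$ and, for
        $t > 0$ and small $\varepsilon > 0$, take $x^{(t)} \in K_1$ with
        $x^{(t)}_j = -t$ and $x^{(t)}_i = -\varepsilon$ for $i \neq j$.
        Then
        $$\langle \gamma, x^{(t)} \rangle
          = -t\,\gamma_j - \varepsilon \sum_{i \neq j} \gamma_i < \eta = 0
          \quad \forall t > 0,$$
        and letting $t \to \infty$ forces $\gamma_j \geq 0$.
        (That $\gamma \neq 0$ comes from the separation theorem itself:
        $\gamma = 0$ would violate the strict inequality.)
  \item \emph{The reverse inequality.} $K_2$ is a linear subspace of
        $\mathbb{R}^k$ (the image of $X$ under a linear map), so
        $v \in X$ implies $-v \in X$. Applying
        $0 \leq \sum_i \gamma_i \langle \zeta_i, v \rangle$ to $-v$ gives
        $0 \leq -\sum_i \gamma_i \langle \zeta_i, v \rangle$, hence
        $$\sum_i \gamma_i \langle \zeta_i, v \rangle = 0
          \quad \forall v \in X,
          \qquad \text{i.e.} \qquad \sum_i \gamma_i \zeta_i = 0.$$
\end{itemize}
```

Together with $\gamma \neq 0$ and $\gamma \geq 0$, the last identity is exactly the positive linear dependence required in $(b)$.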