Set of optimal directions is a cone

Let $\vec{u}$ be an optimal point for the linear programming problem: \begin{equation} \min{ \vec{c}^T \vec{x}}\\ \text{s.t. } A\vec{x}=\vec{b} \text{ and } \vec{x} \geq \vec{0} \end{equation}

where $A\in M_{m\times n}$, $\vec{c}\in\mathbb{R}^n$, $\vec{b}\in\mathbb{R}^m$.

We say that $\vec{d}$ is an optimal direction at $\vec{u}$ if there exists some $\varepsilon > 0$ such that $\vec{u} + \varepsilon\vec{d}$ is also an optimal solution. Let $D\subset\mathbb{R}^n$ be the set of optimal directions at $\vec{u}$. Prove that $D$ is a cone. That is, for any $\vec{x}_1, \vec{x}_2,\ldots,\vec{x}_k \in D$, we must show that

$\{\alpha_1\vec{x}_1 +\alpha_2\vec{x}_2 + \cdots + \alpha_k\vec{x}_k \text{ | } \alpha_i\in\mathbb{R}\text{ , }\alpha_i\geq0\}\subseteq D.$

Note: It might help to use the fact that the set of optimal solutions of such an LP problem is a convex set. Also, we may assume that an optimal solution always lies in the feasible set of the constraints.
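The convexity fact in the note can be sanity-checked numerically. The sketch below uses a made-up tiny LP whose optimal set is a whole edge, and verifies that every convex combination of two optimal vertices is again feasible with the same objective value (all data is illustrative, not part of the original problem):

```python
# Sanity check of the note: for  min c^T x  s.t.  A x = b, x >= 0,
# convex combinations of optimal solutions are again optimal.
# All data below is made up for illustration.

def matvec(A, x):
    """Multiply matrix A (given as a list of rows) by vector x."""
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

def dot(c, x):
    """Inner product c^T x."""
    return sum(ci * xi for ci, xi in zip(c, x))

# Tiny LP whose whole edge {x1 + x2 = 1, x >= 0} is optimal:
A = [[1.0, 1.0]]
b = [1.0]
c = [1.0, 1.0]
p = [1.0, 0.0]  # one optimal vertex
q = [0.0, 1.0]  # another optimal vertex

for lam in [0.0, 0.25, 0.5, 0.75, 1.0]:
    z = [lam * pi + (1 - lam) * qi for pi, qi in zip(p, q)]
    # A z = b (up to floating-point tolerance)
    assert all(abs(lhs - rhs) < 1e-12 for lhs, rhs in zip(matvec(A, z), b))
    assert all(zi >= 0 for zi in z)               # z >= 0
    assert abs(dot(c, z) - dot(c, p)) < 1e-12     # same objective value
print("convex combinations of optimal points stay optimal")
```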

Proof Sketch

We must show that any conic combination of $\vec{x}_1, \vec{x}_2,\ldots,\vec{x}_k \in D$ is an optimal direction at $\vec{u}$. That is, there exists some $\varepsilon>0$ such that $$\vec{u}+ \varepsilon \left(\sum_{i=1}^k \alpha_i \vec{x}_i\right)$$ is also an optimal solution. Since every $\vec{x}_i$ is an optimal direction at $\vec{u}$, there exists some $\varepsilon_i>0$ such that $\vec{u}+\varepsilon_i\vec{x}_i$ is an optimal solution.

My struggle up to this point is that I cannot see the connection between the $\varepsilon$ that I need to find and the $\varepsilon_i$ that I already have.

Best Answer

By induction, it suffices to show that if $\vec{x}_1, \vec{x}_2 \in D$ and $a_1, a_2\geq 0$, then $a_1\vec{x}_1+a_2\vec{x}_2 \in D$. If either $a_1$ or $a_2$ is $0$, the claim is trivial, so we suppose without loss of generality that $a_1, a_2>0$.

To prove this, we let $\varepsilon_1>0$ and $\varepsilon_2>0$ be such that $\vec{u}+\varepsilon_1\vec{x}_1$ and $\vec{u}+\varepsilon_2\vec{x}_2$ are optimal points for your linear program. Such numbers $\varepsilon_1$ and $\varepsilon_2$ exist because $\vec{x}_1, \vec{x}_2\in D$.

Since the collection of optimal solutions for the linear program forms a convex set, we know that for any $\lambda \in [0,1]$, $$ \lambda\left(\vec{u}+\varepsilon_1\vec{x}_1\right) + (1-\lambda)\left(\vec{u}+\varepsilon_2\vec{x}_2\right) = \vec{u} + \lambda\varepsilon_1 \vec{x}_1 + (1-\lambda)\varepsilon_2\vec{x}_2 $$ is also an optimal point. Therefore, in order to prove that $a_1\vec{x}_1+a_2\vec{x}_2 \in D$, it suffices to find $\lambda \in [0,1]$ and some $\varepsilon > 0$ such that $$ \vec{u} + \varepsilon \left(a_1\vec{x}_1+a_2\vec{x}_2\right) =\vec{u} + \lambda\varepsilon_1 \vec{x}_1 + (1-\lambda)\varepsilon_2\vec{x}_2. $$ We thus seek $\lambda\in [0,1]$ and $\varepsilon>0$ such that $$ \varepsilon a_1 = \lambda \varepsilon_1,\quad \varepsilon a_2 = (1-\lambda) \varepsilon_2. $$ Solving this system gives $$ \varepsilon = \frac{\varepsilon_1\varepsilon_2}{a_1\varepsilon_2 + a_2\varepsilon_1} > 0, \qquad \lambda = \frac{a_1\varepsilon_2}{a_1\varepsilon_2 + a_2\varepsilon_1} \in (0,1), $$ where positivity of $a_1, a_2, \varepsilon_1, \varepsilon_2$ guarantees both conditions, and we are done.
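As a quick numerical check, the system $\varepsilon a_1 = \lambda \varepsilon_1$, $\varepsilon a_2 = (1-\lambda)\varepsilon_2$ can be solved and verified in Python. The values of $a_1, a_2, \varepsilon_1, \varepsilon_2$ below are arbitrary positive placeholders, not taken from the problem:

```python
# Solve  eps*a1 = lam*eps1  and  eps*a2 = (1-lam)*eps2  for eps and lam.
# The numbers below are arbitrary positive placeholders.

a1, a2 = 2.0, 3.0        # conic coefficients, assumed > 0
eps1, eps2 = 0.5, 0.125  # step sizes witnessing x1, x2 in D

# Eliminating lam from the two equations yields:
denom = a1 * eps2 + a2 * eps1
eps = eps1 * eps2 / denom    # eps > 0
lam = a1 * eps2 / denom      # lam in (0, 1)

assert eps > 0 and 0 < lam < 1
assert abs(eps * a1 - lam * eps1) < 1e-12        # first equation holds
assert abs(eps * a2 - (1 - lam) * eps2) < 1e-12  # second equation holds
print(f"eps = {eps}, lam = {lam}")
```

Since the denominator is a sum of products of positive numbers, $\varepsilon>0$ and $\lambda\in(0,1)$ hold for any positive choice of the four inputs, which is exactly what the proof requires.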