Optimal basic feasible solution in which reduced cost vector has a negative component


I am stuck on the following question. I am mainly having a hard time coming up with an illustrative example. The question is as follows: Consider the linear program $(P): \min\limits_x \{ c^T x \; : \; Ax = b, x\geq 0 \}$, where $A$ has full rank. If $x$ is an optimal basic feasible solution for $(P)$ with an associated basis matrix $B$, and $\bar{c}$ is the corresponding vector of reduced costs, then is it always true that $\bar{c} \geq 0$ ?

I would assume that the answer is no, but I am having trouble coming up with a counterexample. Any help would be appreciated.
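For experimenting with candidate bases, the reduced costs in the question can be computed directly from the definition $\bar{c}^T = c^T - c_B^T B^{-1} A$. The following is a minimal sketch in NumPy; the LP data and the choice of basis here are illustrative placeholders (not a counterexample to the question):

```python
import numpy as np

# Compute the reduced-cost vector c_bar = c - A^T y, where y solves B^T y = c_B,
# for a chosen basis of the standard-form LP  min c^T x  s.t.  Ax = b, x >= 0.
# The data below are illustrative, not a counterexample.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([-3.0, -5.0, 0.0, 0.0])

basis = [2, 3]                      # indices of the basic columns
B = A[:, basis]                     # basis matrix
x_B = np.linalg.solve(B, b)         # values of the basic variables (must be >= 0)
y = np.linalg.solve(B.T, c[basis])  # simplex multipliers
c_bar = c - A.T @ y                 # reduced costs (zero on the basic indices)

print(x_B)    # [4. 6.]  -> a feasible basic solution
print(c_bar)  # [-3. -5.  0.  0.]  -> negative entries, so this basis is not optimal
```

To probe the question, one would repeat this for each basis representing a (possibly degenerate) optimal basic feasible solution and inspect the sign of `c_bar`.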

There is 1 answer below.


It can easily be seen that each component of $\bar{c}$ is either negative, zero, or positive. For any index set $S \subset \mathbb N$, let $c_S$ denote the subvector of $c$ whose entries are indexed by $S$.
Let $I = \{i : c_i > 0\}$ and $J = \{i : c_i < 0\}$. For each $i$, let $\hat{C}_i$ be the set obtained by taking, for every constraint in which $x_i$ appears, the constraint's right-hand side divided by the coefficient of $x_i$ in that constraint, and let $u_i = \min \hat{C}_i$ be its smallest element. Finally, let $F$ be the logistic function $F(x) = \frac{1}{1+e^{-x}}$. It can then be argued that

$$ F\!\left(\sum_{i \in I} c_i - \sum_{i \in J} c_i\right) F\!\left(\sum_{i \in I} u_i - \sum_{i \in J} u_i\right) $$ approximates the probability that the objective value is negative under minimization. Using this estimate, you may tweak the constraints and the objective function so as to increase the probability that the objective value is negative.
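As a sketch, the proposed quantity is straightforward to evaluate numerically. The values of $c_i$ and $u_i$ below are hypothetical, chosen only to illustrate the computation:

```python
import math

def F(x):
    """Logistic function F(x) = 1 / (1 + e^{-x})."""
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical data: c_i over I (positive costs) and J (negative costs),
# with the corresponding bounds u_i, purely to illustrate the formula.
c_I, c_J = [2.0, 1.0], [-3.0]
u_I, u_J = [5.0, 4.0], [6.0]

p = F(sum(c_I) - sum(c_J)) * F(sum(u_I) - sum(u_J))
print(p)  # a value in (0, 1)
```

Since each factor lies strictly between 0 and 1, the product is always a value in $(0, 1)$ regardless of the inputs.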