Second order conditions in linear programming problems?


I know that in linear programming problems, in order to find maxima or minima, one searches among the corner (vertex) solutions of a polyhedron. So there is a system of linear constraints that one solves with a specific procedure, namely the simplex method or the KKT conditions. But why would someone check second-order conditions? Since linear programming is not a field I have seen much of, I have in mind that the second-order conditions should be identically zero. Is there any chance that this is different from zero?
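As a small illustrative sketch of the "optimum at a corner" point (the LP here and the use of `scipy.optimize.linprog` are my own assumptions, not from the question): maximizing $3x + 2y$ subject to $x + y \le 4$, $x \le 3$, $x, y \ge 0$ lands exactly on a vertex of the feasible polygon.

```python
# Sketch: LP optima occur at vertices of the feasible polyhedron.
# Hypothetical example problem; maximize 3x + 2y subject to
# x + y <= 4, x <= 3, x, y >= 0, by minimizing the negated objective.
from scipy.optimize import linprog

c = [-3, -2]                     # negate to turn the max into a min
A_ub = [[1, 1], [1, 0]]          # x + y <= 4,  x <= 3
b_ub = [4, 3]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

print(res.x)      # optimal point -- a vertex of the polygon
print(-res.fun)   # optimal value of the original maximization
```

The solver returns the vertex $(3, 1)$ with value $11$; the other vertices $(0,0)$, $(3,0)$, $(0,4)$ give smaller objective values.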


Best answer:

Actually, the second-order conditions are satisfied in most cases. Let us take the simple example of optimizing
$$f(x) = \alpha x$$
over $x \in [0,1]$. Here, "most cases" means $\alpha \ne 0$, and w.l.o.g. we discuss $\alpha > 0$. Then $\bar x = 0$ is the point of interest. At this point, the first-order condition is
$$ f'(\bar x) h \ge 0 \qquad \forall h \ge 0, $$
and it is satisfied because $f'(\bar x) = \alpha > 0$.

The second-order conditions involve the critical cone
$$ C := \{ h \ge 0 \mid f'(\bar x) h = 0 \}. $$
The necessary condition is
$$ h^\top f''(\bar x) h \ge 0 \qquad \forall h \in C $$
(with the Hessian "matrix" $f''(\bar x)$), and the sufficient condition is
$$ h^\top f''(\bar x) h > 0 \qquad \forall h \in C \setminus \{0\}. $$
Note that both conditions are (trivially) satisfied because $C = \{0\}$.
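The collapse of the critical cone can be checked directly in a few lines (a minimal numerical sketch of the example above; the sampled directions `hs` are my own choice):

```python
# Sketch of the example: f(x) = alpha * x on [0, 1], alpha > 0,
# with x_bar = 0 the point of interest. Feasible directions there are h >= 0.
alpha = 2.0
fprime = alpha      # f'(x_bar) = alpha for the linear objective
fsecond = 0.0       # f''(x_bar) = 0: the "Hessian" of a linear function

# First-order condition: f'(x_bar) * h >= 0 for all feasible h >= 0.
hs = [0.0, 0.1, 0.5, 1.0]               # a few sample directions
assert all(fprime * h >= 0 for h in hs)

# Critical cone C = { h >= 0 : f'(x_bar) * h = 0 } collapses to {0},
# since fprime = alpha > 0 kills every h > 0.
C = [h for h in hs if fprime * h == 0]
print(C)

# Second-order necessary condition h * f''(x_bar) * h >= 0 holds on C,
# and the sufficient condition is vacuous on C \ {0} (an empty set).
assert all(h * fsecond * h >= 0 for h in C)
```

So even though the Hessian is identically zero, the second-order conditions hold trivially: there are simply no nonzero critical directions to test.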

(However, much more is true: Even the first-order sufficient condition $$ f'(\bar x) h > 0 \qquad \forall h > 0 $$ is satisfied.)