algorithm to find optimal vertices in convex optimization


Given $$\min_x\; x^TQx \\ \text{s.t. } Ax=b,\ x \ge 0$$ where $Q \ne 0$ and $Q$ is positive semidefinite.

Is there an algorithm that guarantees a vertex solution, or at least prefers a vertex solution when one exists? I observed that a version of the Frank–Wolfe algorithm combined with the simplex method might be useful, but I have no proof of that. I need a suggestion of an algorithm that does this (not brute force). The answer below doesn't consider the non-negativity constraint on $x$, so its solution $y$ isn't guaranteed to be an optimal vertex solution.
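For reference, here is a minimal sketch of the Frank–Wolfe idea mentioned above (the function name and stopping rule are my own; this is an illustration, not a proven vertex-finding method). Each Frank–Wolfe subproblem is a linear program over the feasible polytope, so its solution is a vertex; the iterates are convex combinations of vertices, which is why the method "prefers" vertices but need not terminate at one.

```python
import numpy as np
from scipy.optimize import linprog

def frank_wolfe(Q, A, b, x0, iters=200):
    """Frank-Wolfe for min x^T Q x  s.t.  Ax = b, x >= 0.

    Each LP subproblem is solved over the feasible polytope,
    so every step direction points toward a vertex."""
    x = x0.astype(float)
    for _ in range(iters):
        grad = 2 * Q @ x                 # gradient of x^T Q x
        # LP: minimize grad^T s over {As = b, s >= 0}; optimum is a vertex
        s = linprog(grad, A_eq=A, b_eq=b, bounds=(0, None)).x
        d = s - x
        # exact line search for the quadratic along d
        denom = d @ Q @ d
        if denom > 1e-12:
            gamma = float(np.clip(-(x @ Q @ d) / denom, 0.0, 1.0))
        else:
            gamma = 1.0 if x @ Q @ d < 0 else 0.0
        if gamma == 0.0:
            break                        # no descent along d: stop
        x = x + gamma * d
    return x
```

A feasible starting point can be obtained from `linprog` with a zero objective, which itself returns a vertex of $\{x : Ax=b,\ x\ge 0\}$.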

Edit:

The vertices are real-valued; this isn't integer programming. It might, however, be relevant to integer programming where $A$ is totally unimodular, but its LCP form isn't totally unimodular even if every element of $Q$ lies in $\{-1,0,1\}$.

There is 1 answer below.


$ \def\l{\lambda} \def\LR#1{\left(#1\right)} \def\BR#1{\Big(#1\Big)} \def\trace#1{\operatorname{Tr}\LR{#1}} \def\qiq{\quad\implies\quad} \def\p{\partial} \def\grad#1#2{\frac{\p #1}{\p #2}} \def\a#1{\color{green}{#1}} \def\b#1{\color{blue}{#1}} \def\c#1{\color{red}{#1}} \def\CLR#1{\c{\LR{#1}}} \def\fracLR#1#2{\LR{\frac{#1}{#2}}} $The beauty of a linear constraint is that it can be explicitly solved $$\eqalign{ Ax = b \qiq x &= \b{A^+b} \;+\; \CLR{I-A^+A}u \\ &= \b{x_0} \;+\; \c{P}u \\ }$$ where $u$ is a new unconstrained vector variable, $A^+$ is the pseudoinverse, and $P$ is an orthoprojector into the nullspace of $A$.
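As a numerical sanity check of this parametrization (a sketch using NumPy's `pinv`; the matrices here are random placeholders, not from the question), the projector $P = I - A^+A$ maps any $u$ into the nullspace of $A$, so $x = x_0 + Pu$ always satisfies $Ax = b$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 4))   # generic full-row-rank A
b = rng.standard_normal(2)

Apinv = np.linalg.pinv(A)         # pseudoinverse A^+
x0 = Apinv @ b                    # particular solution of Ax = b
P = np.eye(4) - Apinv @ A         # orthoprojector onto nullspace of A

# any x = x0 + P u satisfies Ax = b, for arbitrary u:
u = rng.standard_normal(4)
x = x0 + P @ u
print(np.allclose(A @ x, b))      # True:  A P = A - A A^+ A = 0
print(np.allclose(P @ P, P))      # True:  P is idempotent
```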

Substituting this into the objective function yields $$\eqalign{ \l &= x^TQx \\ &= {\LR{Pu+x_0}^TQ\LR{Pu+x_0}} \\ &= u^TPQPu + 2x_0^TQPu + x_0^TQx_0 \\ }$$ Calculate the gradient wrt $u$ $$\eqalign{ d\l &= \LR{2u^TPQP + 2x_0^TQP}\:du \\ &= 2\LR{PQPu + PQx_0}^T\:du \\ \grad{\l}{u} &= 2\LR{PQPu + PQx_0} \\ }$$ Set the gradient to zero and solve for the optimal vector $$\eqalign{ (PQP)u &= -PQx_0 \\ u &= -\LR{PQP}^+PQx_0 \\ x &= x_0 \;+\; Pu \\ &= x_0 \;-\; P\LR{PQP}^+PQx_0 \\ &= \BR{I \,-\, P\LR{PQP}^+PQ}\,A^+b \\ }$$
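The closed-form solution above can be checked numerically (again a sketch with placeholder random matrices; $Q = MM^T$ just guarantees a nonzero PSD matrix). The two things to verify are that the constraint $Ax=b$ still holds and that the gradient condition $PQPu + PQx_0 = 0$ is met, which collapses to $PQx = 0$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((2, n))
b = rng.standard_normal(2)
M = rng.standard_normal((n, 3))
Q = M @ M.T                       # positive semidefinite, Q != 0

Apinv = np.linalg.pinv(A)
x0 = Apinv @ b
P = np.eye(n) - Apinv @ A

# x = (I - P (PQP)^+ P Q) A^+ b
x = (np.eye(n) - P @ np.linalg.pinv(P @ Q @ P) @ P @ Q) @ x0

print(np.allclose(A @ x, b))              # constraint preserved (A P = 0)
print(np.allclose(P @ Q @ x, 0))          # stationarity: gradient wrt u is 0
```

Note, as the question points out, that this construction ignores the non-negativity constraint $x \ge 0$, so the resulting $x$ need not be feasible for the original problem, let alone a vertex.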