Efficient algorithm for determining whether value of convex optimization program is below some value?


Let $X$ be a convex subset of $\mathbb{R}^N$, let $c \in \mathbb{R}^N$.

I want to know whether

$$ \min_{x \in X} x^\top c < 0. $$

Obviously, I can (efficiently, with standard software) solve the program and then compare its value to zero, but this is clearly wasteful: I don't need the optimal value, only its sign.

My question is: is there a standard algorithm that works with the composed problem directly and is much more efficient as a result? (A naïve method would be to check each iterate of some interior-point method and terminate as soon as the objective drops below zero — is there something better?)


2 Answers


Your problem is equivalent to solving $$ \min_x \; \delta_X(x) + c^\top x, $$ where $\delta_X$ is the indicator function of $X$, and checking whether the minimal value is negative. If the projection onto the set $X$ is easy to compute, a standard method is Douglas–Rachford splitting. Specifically, its iterations are given by: $$\left\{ \begin{aligned} y_{k+1} & = x_k - \gamma c \\ z_{k+1} & = \operatorname{proj}_{X}(2y_{k+1}-x_k) \\ x_{k+1} & = x_k + z_{k+1} - y_{k+1} \end{aligned} \right. $$ where $\gamma>0$ is a step-size parameter, $y_{k+1} = \operatorname{prox}_{\gamma f}(x_k)$ for $f(x) = c^\top x$, and $z_{k+1}$ is the proximal step for $\delta_X$. It can be shown that $y_k$ and $z_k$ converge to an optimal solution.
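A minimal sketch of these iterations, with an early exit as soon as a feasible iterate $z_k$ certifies $c^\top z_k < 0$ (the choice of $X$ as a Euclidean ball, and the names `proj_ball` / `dr_min_linear`, are illustrative assumptions — any set with a cheap projection works):

```python
import numpy as np

def proj_ball(v, center, radius):
    """Euclidean projection onto the ball B(center, radius) --
    a hypothetical choice of X with an easy projection."""
    d = v - center
    n = np.linalg.norm(d)
    return v if n <= radius else center + (radius / n) * d

def dr_min_linear(c, proj, x0, gamma=1.0, tol=1e-8, max_iter=10_000):
    """Douglas-Rachford splitting for min_x delta_X(x) + c^T x.
    prox of the linear term: y = x - gamma*c; prox of delta_X: proj.
    Stops early once c^T z_k < 0, since the feasible iterate z_k
    already answers the original yes/no question."""
    x = x0.copy()
    z = proj(x)
    for _ in range(max_iter):
        y = x - gamma * c                # prox_{gamma f}(x_k), f(x) = c^T x
        z = proj(2 * y - x)              # prox_{gamma g} = proj_X
        x = x + z - y                    # DR update of the governing sequence
        if c @ z < 0:                    # early exit: answer is "yes"
            return z
        if np.linalg.norm(y - z) < tol:  # fixed point: y ~ z ~ optimum
            return z
    return z

c = np.array([1.0, -2.0])
z = dr_min_linear(c, lambda v: proj_ball(v, np.zeros(2), 1.0), np.zeros(2))
```

For this ball the minimum is $-\lVert c\rVert < 0$, so the loop exits early with a feasible point of negative objective value rather than iterating to optimality.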


If the dimension of your problem is not too large, I suggest using a primal-dual interior-point method (see https://en.wikipedia.org/wiki/Interior-point_method). Once the primal objective value drops below 0 you can stop and answer "yes"; once the dual objective value rises above 0 you can stop and answer "no", since by weak duality the dual value is a lower bound on the minimum.
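To illustrate the primal/dual test concretely (a sketch assuming a polyhedral $X = \{x : Ax \le b\}$, here the simplex; it applies the weak-duality check to the primal/dual pair returned by SciPy's HiGHS solver, since SciPy does not expose per-iterate callbacks for this method):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical polyhedral X = {x : A x <= b}: the simplex {x >= 0, x1 + x2 <= 1}.
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 0.0, 0.0])
c = np.array([-1.0, 2.0])

res = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None)] * 2, method="highs")

primal_val = c @ res.x             # value of a feasible point: an upper bound
lam = -res.ineqlin.marginals       # multipliers lambda >= 0 (SciPy reports <= 0)
dual_val = -b @ lam                # weak duality: a lower bound on the minimum

# Decision rule: primal_val < 0  =>  answer "yes";
#                dual_val  > 0  =>  answer "no".
print(primal_val, dual_val)
```

Here the minimum is $-1$ (attained at the vertex $(1,0)$), so both bounds certify the answer "yes"; in an interior-point run the same two tests would be applied to each intermediate primal/dual pair.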