Let $X$ be a convex subset of $\mathbb{R}^N$ and let $c \in \mathbb{R}^N$.
I want to know whether
$$ \min_{x \in X} x^\top c < 0. $$
Obviously, I can solve the program to optimality (efficiently, with standard software) and then compare its value to zero, but this is wasteful: I only need the sign of the optimal value, not the value itself.
My question is: is there a standard algorithm that works with the composed problem directly and is much more efficient as a result? A naïve method would be to check each iterate of some interior-point method and terminate as soon as the objective goes below zero -- is there something better?
Your problem is equivalent to solving $$ \min_x \; \delta_X(x)+c^\top x, $$ where $\delta_X$ is the indicator function of $X$, and determining whether the minimal value is less than $0$. If the projection onto the set $X$ is easy to compute, the standard method is Douglas-Rachford splitting. Specifically, its iterations are given as: $$\left\{ \begin{aligned} y_{k+1} & =x_{k}-\gamma c \\ z_{k+1} & =\operatorname{proj}_{X}(2y_{k+1}-x_{k}) \\ x_{k+1}&=x_k+z_{k+1}-y_{k+1} \end{aligned} \right. $$ where $\gamma>0$ is a parameter. It can be shown that $y_k$ and $z_k$ converge to an optimal solution (assuming one exists). Note also that $z_k \in X$ at every iteration, so you can terminate early as soon as $c^\top z_k < 0$: that already certifies a negative optimal value, which directly addresses your question about working with the composed problem.
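A minimal sketch of this scheme, assuming $X$ is a Euclidean ball so the projection is available in closed form (the names `proj_ball`, `dr_sign_test`, `gamma`, `tol` are illustrative choices, not a standard API):

```python
import numpy as np

def proj_ball(v, center, radius):
    """Euclidean projection onto the ball {x : ||x - center|| <= radius}."""
    d = v - center
    n = np.linalg.norm(d)
    return v if n <= radius else center + radius * d / n

def dr_sign_test(c, center, radius, gamma=1.0, tol=1e-8, max_iter=10_000):
    """Decide whether min_{x in X} c^T x < 0 via Douglas-Rachford splitting,
    terminating early once a feasible point with negative objective appears."""
    x = center.copy()
    for _ in range(max_iter):
        y = x - gamma * c                          # prox of x -> gamma * c^T x
        z = proj_ball(2 * y - x, center, radius)   # prox of the indicator of X
        x = x + z - y                              # Douglas-Rachford update
        if c @ z < -tol:                           # z is feasible: early exit
            return True
        if np.linalg.norm(y - z) < tol:            # y ~ z ~ optimal solution
            return bool(c @ z < 0)
    return bool(c @ z < 0)

c = np.array([1.0, -2.0])
# Ball of radius 1 around (3, 0): the minimum of c^T x is
# c^T center - ||c|| = 3 - sqrt(5) > 0, so the test is negative.
print(dr_sign_test(c, np.array([3.0, 0.0]), 1.0))
# Ball of radius 1 around the origin: the minimum is -||c|| < 0.
print(dr_sign_test(c, np.zeros(2), 1.0))
```

The early-exit check is what makes this cheaper than solving to optimality: when the optimal value is negative, a certifying feasible iterate typically appears long before $y_k$ and $z_k$ have converged.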