Most available optimization solvers accept a set of "AND" constraints, i.e. they find an optimum within the region where all constraints are satisfied simultaneously. Sometimes, however, I need to solve a problem with "OR" constraints, i.e. where at least one of several constraints must hold. For example, I might want to optimize a function of a single variable on the union of the intervals $x\leqslant 1$ and $x \geqslant 2$, i.e. find a minimum of the function where either $x\leqslant 1$ OR $x \geqslant 2$. For a single variable this kind of restriction can be expressed as a polynomial constraint: in this specific case, $(x-1)(x-2)\geqslant 0$. This inequality restricts the optimizer to exactly the region I need.
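To illustrate, here is a minimal sketch of the single-variable trick using SciPy's SLSQP solver. The objective $(x-1.4)^2$ is my own hypothetical choice: its unconstrained minimum, $x=1.4$, falls inside the forbidden gap $(1,2)$, so the solver should be pushed onto the nearer boundary of the feasible union.

```python
from scipy.optimize import minimize

# Hypothetical objective: unconstrained minimum at x = 1.4,
# which lies inside the excluded gap (1, 2).
def f(x):
    return (x[0] - 1.4) ** 2

# Single polynomial constraint encoding "x <= 1 OR x >= 2":
# (x - 1)(x - 2) >= 0 holds exactly on that union of intervals.
cons = {"type": "ineq", "fun": lambda x: (x[0] - 1.0) * (x[0] - 2.0)}

res = minimize(f, x0=[0.0], constraints=[cons], method="SLSQP")
print(res.x[0])  # converges to the nearer boundary, x = 1
```

Note that the feasible set is non-convex, so a local solver started at $x_0=0$ only explores the left interval; starting at $x_0=3$ would instead converge to $x=2$.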
However, this becomes tricky in higher dimensions. Take even the simplest case with two variables: I need a solution where either $x\geqslant 0$ OR $y \geqslant 0$, i.e. every point except those where both are negative. If I impose $xy \leqslant 0$, the two variables are forced to have opposite signs (or one of them to be zero), so the case where both are positive is wrongly excluded.
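A quick sanity check (hypothetical helper names) confirms that the product constraint cuts off points the OR condition should allow:

```python
def or_region(x, y):
    # Desired feasible set: x >= 0 OR y >= 0.
    return x >= 0 or y >= 0

def product_constraint(x, y):
    # Candidate encoding: x * y <= 0.
    return x * y <= 0

# (1, 1) satisfies the OR condition but violates x*y <= 0,
# so the product constraint wrongly excludes the open
# positive quadrant.
print(or_region(1, 1), product_constraint(1, 1))      # True False

# (-1, -1) is correctly excluded by both formulations.
print(or_region(-1, -1), product_constraint(-1, -1))  # False False
```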
My question: is there any general technique for encoding such "OR" constraints (analogous to the polynomial trick in the single-variable case)?