Is individually checking constraints for convexity in an optimisation problem valid?


I have a quadratic minimisation problem where both the objective function and the constraints contain quadratic terms, such as a product of a continuous throttle variable and an on/off (binary) integer variable.

My question is: can I check my problem for convexity piece by piece?

That is, if I analyse each one of my constraints independently and find that they are all linear or convex, does it follow that the problem as a whole (those constraints interacting together with the objective) is also convex?
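For the per-constraint check itself, a quadratic constraint $x^\top Q x + c^\top x \leq b$ is convex exactly when the (symmetrized) matrix $Q$ is positive semidefinite, so each constraint can be tested by looking at the eigenvalues of its $Q$. A minimal sketch, with a hypothetical helper name and numpy as an assumed dependency:

```python
import numpy as np

def is_convex_quadratic(Q, tol=1e-9):
    """A quadratic constraint x^T Q x + c^T x <= b is convex
    iff the symmetric part of Q is positive semidefinite."""
    Qs = 0.5 * (Q + Q.T)  # only the symmetric part of Q matters
    return bool(np.linalg.eigvalsh(Qs).min() >= -tol)

# Convex example: x1^2 + x2^2 <= 1 has Q = I.
print(is_convex_quadratic(np.eye(2)))  # True

# The bilinear term x*y <= 1 (throttle * on/off) has
# Q = [[0, 0.5], [0.5, 0]] with eigenvalues +/- 0.5,
# so that constraint is nonconvex on its own.
Q_bilinear = np.array([[0.0, 0.5], [0.5, 0.0]])
print(is_convex_quadratic(Q_bilinear))  # False
```

Note that this check already flags the throttle-times-binary product from the question as nonconvex, independently of the integrality of $y$.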

I'm trying to figure out whether I should approach this problem with LP/MIP-based techniques (e.g. Gurobi), which I understand work best on convex problems, or with evolutionary techniques (e.g. PSO, GA) if it isn't convex.

Edit/Clarification: I'm aware that including integer variables makes the problem non-smooth, which results in non-convexity (or at least I think that's correct). However, MIP solvers relax the integer constraints into continuous ones during the search, so I don't think that affects the solution quality as much as getting trapped in a local minimum would.


BEST ANSWER

If you have constraints where you turn things on/off by multiplying continuous variables with binary variables, you are typically not going to end up with convex models (in the sense of having convex continuous relaxations).

Instead, you should model this using a big-M strategy. For instance, to model $xy\geq 1$ where $x$ is continuous and $y$ is binary, introduce a new variable $w$ which represents the product. The product can be seen as a logic condition (if $y=0$ then $w=0$, else $w=x$), which can be written using the big-M constraints $-M(1-y) \leq x-w \leq M(1-y)$ and $-My\leq w\leq My$; the original constraint then simply becomes $w \geq 1$. Here $M$ is a constant large enough not to restrict $x$ when $y$ is $1$ (for instance, an upper bound on $|x|$ in any solution). The smaller an $M$ you can derive, the better from a numerical point of view.
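The linearization above can be sanity-checked by brute force: for a bounded $x$, the only $w$ satisfying both big-M constraints is exactly $x\cdot y$. A small sketch (the function name and grid are illustrative, not part of any solver API):

```python
def big_m_feasible(x, w, y, M):
    """Check the big-M linearization of w = x*y (y binary, |x| <= M):
       -M(1-y) <= x - w <= M(1-y)   forces w = x when y = 1
       -M*y    <= w     <= M*y      forces w = 0 when y = 0
    """
    return (-M * (1 - y) <= x - w <= M * (1 - y)) and (-M * y <= w <= M * y)

# Brute-force check on a grid of candidate w values (step 0.25):
# for each (x, y), the only feasible w should be exactly x*y.
M = 10.0
for y in (0, 1):
    for x in (-7.5, -1.0, 0.0, 2.5, 9.0):
        feasible_w = [k / 4 for k in range(-40, 41)
                      if big_m_feasible(x, k / 4, y, M)]
        assert feasible_w == [x * y], (x, y, feasible_w)
print("big-M linearization recovers w = x*y on the grid")
```

Note that the check only works because every tested $x$ satisfies $|x| \leq M$; with a too-small $M$ the constraints would cut off legitimate values of $x$ when $y=1$.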