I am implementing a mixed-integer linear programming problem, and I am dealing with a huge number of constraints. Does anyone know what the relation is between the number of constraints and the computation time (for example, 1 million constraints costs around ?? hours)?
Thanks in advance.
Integer programming is NP-hard in general (its decision version is NP-complete), which means no polynomial-time algorithm for it is known, and none exists unless P = NP. So, in the most general case, without looking at your particular problem, one has to say that there is no formula relating instance size to running time, and no guarantee of solving it in any reasonable time.
The actual computational complexity depends on the problem structure. E.g., the knapsack problem is solvable in pseudopolynomial time, i.e., time polynomial in the magnitude of the problem's coefficients rather than in their encoding length [1]. So, when the coefficients share a common factor, simply rescaling them can reduce the computation time.
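To make the pseudopolynomial point concrete, here is a minimal sketch (hypothetical illustration, not your problem) of the textbook dynamic program for 0/1 knapsack. Its table has n * (W + 1) cells, so the running time grows with the *magnitude* of the capacity W, not with the number of bits needed to write W down:

```python
def knapsack(values, weights, W):
    """Max value of a subset of items with total weight <= W (0/1 knapsack)."""
    best = [0] * (W + 1)  # best[c] = max value achievable with capacity c
    for i in range(len(values)):
        # Iterate capacity downward so each item is used at most once
        for c in range(W, weights[i] - 1, -1):
            best[c] = max(best[c], best[c - weights[i]] + values[i])
    return best[W]

# Dividing all weights and the capacity by a common factor (here 10)
# shrinks the DP table tenfold without changing the optimum:
print(knapsack([60, 100, 120], [10, 20, 30], 50))  # -> 220
print(knapsack([60, 100, 120], [1, 2, 3], 5))      # -> 220
```

Both calls return the same optimum, but the second runs over a capacity range of 5 instead of 50, which is the rescaling effect mentioned above.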
In short, it is all very problem specific.
Some problems have structure that allows them to be simplified or reformulated as an easier problem. One could also look to methods like branch and bound, cutting planes, or decomposition, or a combination of these, to solve the problem. The literature on large-scale optimization methods would be of help here.
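As a sketch of the branch-and-bound idea (a toy for 0/1 knapsack, not production code, and not what a commercial MILP solver does internally): bound each node with a cheap relaxation, here the fractional-knapsack bound, and prune any subtree whose bound cannot beat the best solution found so far.

```python
def knapsack_bb(values, weights, W):
    """0/1 knapsack by branch and bound with a fractional relaxation bound."""
    n = len(values)
    # Consider items in decreasing value/weight ratio; good ordering tightens bounds
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)

    def bound(k, cap):
        # Upper bound: fill remaining capacity greedily, allowing a fractional item
        total = 0.0
        for i in order[k:]:
            if weights[i] <= cap:
                cap -= weights[i]
                total += values[i]
            else:
                total += values[i] * cap / weights[i]
                break
        return total

    best = 0
    def visit(k, cap, val):
        nonlocal best
        best = max(best, val)
        # Prune: even the optimistic bound cannot beat the incumbent
        if k == n or val + bound(k, cap) <= best:
            return
        i = order[k]
        if weights[i] <= cap:                      # branch: take item i ...
            visit(k + 1, cap - weights[i], val + values[i])
        visit(k + 1, cap, val)                     # ... or leave it out

    visit(0, W, 0)
    return best

print(knapsack_bb([60, 100, 120], [10, 20, 30], 50))  # -> 220
```

Real MILP solvers apply the same take-or-leave branching to fractional variables of the LP relaxation and add cutting planes at each node; the pruning logic is what makes performance so dependent on problem structure rather than on constraint count alone.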
[1] Papadimitriou, C. H. 1981. On the complexity of integer programming. J. ACM 28(4), 765–768.