While studying for an upcoming exam, I came across the following problem:
Write an integer program to minimize the maximum lateness $L_{\max}$ for the single-machine scheduling problem with precedence constraints and unit processing times, i.e. $1|\text{prec},\, p_j = 1|L_{\max}$,
where we use the binary variable $x_{jt}$, equal to $1$ if job $j$ is scheduled in time slot $t$ and $0$ otherwise.
I know this is a variant of the famous NP-hard $1|r_j|L_{\max}$ problem, which is usually solved with branch and bound, but I'm a bit stuck on how to approach this one.
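To make the objective concrete before writing the IP, here is a small brute-force sketch (pure Python, a hypothetical toy instance; the due dates and precedence pairs are made up for illustration). It enumerates all job orders that respect the precedence constraints and, since every job takes one unit of time, reads off $L_{\max}$ directly from each job's position:

```python
from itertools import permutations

def min_max_lateness(due, prec):
    """Brute-force minimum L_max for 1|prec, p_j = 1|L_max.

    due  -- dict mapping job -> due date
    prec -- set of (i, j) pairs meaning job i must precede job j
    """
    jobs = list(due)
    best = None
    for order in permutations(jobs):
        pos = {j: t for t, j in enumerate(order)}
        # discard any order that violates a precedence pair
        if any(pos[i] > pos[j] for i, j in prec):
            continue
        # unit processing times: the job in slot t completes at time t + 1
        lmax = max(pos[j] + 1 - due[j] for j in jobs)
        if best is None or lmax < best:
            best = lmax
    return best

# hypothetical instance: 3 unit jobs, job 1 must precede job 3
print(min_max_lateness({1: 1, 2: 2, 3: 3}, {(1, 3)}))  # -> 0
```

This is only a sanity-check for tiny instances (it is $O(n!)$), but it pins down exactly what the integer program has to optimize.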
Also, excuse any formatting issues; this is my first time actually posting in this great place, so bear with me! Thanks.
Introduce a variable $z$ to represent the maximum lateness. The objective is to minimize $z$. Now you need three sets of linear constraints to enforce: