I'm trying to find an algorithm for the following problem. Let $G$ be a bipartite graph. The edges in $G$ have labels $R$; each label $R(u, v)$ is an integer range $[a, b]$, where $a$ and $b$ are nonnegative integers with $a \le b$. The problem is to assign each node $v$ in the graph a nonnegative integer label $L(v)$ such that
- If $(u, v) \in G$ then $L(v) - L(u) \in R(u, v)$
- The sum of $L(v) - L(u)$ for all $(u, v) \in G$ is minimized
Are there any similar problems I can read about to give me some insight?
Thanks Mike, for some reason I hadn't thought to formulate it as a linear program. But once you do that, the answer is obvious. In fact, when you set up the linear program, the resulting constraint matrix is totally unimodular, which means the LP can be solved by any linear solver and, since the range endpoints are integers, the optimal solution it finds will automatically be integral — no explicit integrality constraints needed.
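For anyone finding this later, here is a minimal sketch of that LP formulation using SciPy's `linprog`. The graph instance (node count, edge list, and ranges) is made up purely for illustration; each edge $(u, v)$ with range $[a, b]$ contributes two inequality rows, $L(v) - L(u) \le b$ and $-(L(v) - L(u)) \le -a$, and the objective sums $L(v) - L(u)$ over all edges:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance: nodes 0..3, edges (u, v, a, b) meaning
# the range constraint a <= L(v) - L(u) <= b.
n = 4
edges = [(0, 2, 1, 3), (1, 2, 0, 2), (1, 3, 2, 4)]

# Objective: minimize sum over edges of L(v) - L(u).
c = np.zeros(n)
A_ub, b_ub = [], []
for u, v, a, b in edges:
    c[v] += 1
    c[u] -= 1
    row = np.zeros(n)
    row[v], row[u] = 1, -1
    A_ub.append(row)   # L(v) - L(u) <= b
    b_ub.append(b)
    A_ub.append(-row)  # -(L(v) - L(u)) <= -a
    b_ub.append(-a)

# Nonnegative labels; the constraint matrix is a signed incidence
# matrix, hence totally unimodular, so a simplex-type solver returns
# an integral vertex solution for integer a, b.
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * n)
print(res.x, res.fun)
```

For this toy instance every edge can sit at its lower bound simultaneously, so the optimal objective is $1 + 0 + 2 = 3$. Note the optimal labels are not unique (shifting all labels by a constant preserves every difference), but the objective value is.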