Given a set $V=\{v_1, \dots, v_n\}$ of values and a set $S=\{s_1, \dots, s_m\}$ of target sums, partition $V$ into $\tilde{V}_1, \dots, \tilde{V}_m$ such that $\sum_{v \in \tilde{V}_k} v = s_k + \epsilon_k$ and $\sum_{k=1}^{m} \left| \epsilon_k \right|$ is minimized. Each $\epsilon_k$ represents the error for sum $s_k$ and can be positive or negative.
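To make the objective concrete, here is a brute-force sketch that enumerates every assignment of values to parts and keeps the one minimizing $\sum_k |\epsilon_k|$ (only viable for tiny instances; the data below is made up for illustration):

```python
# Brute force over all m^n assignments of values to parts.
# V and S are illustrative toy data, chosen so a zero-error partition exists.
from itertools import product

V = [1, 2, 3, 4]
S = [4, 6]

best_err, best_assign = float("inf"), None
for assign in product(range(len(S)), repeat=len(V)):  # part index for each value
    sums = [0.0] * len(S)
    for value, k in zip(V, assign):
        sums[k] += value
    err = sum(abs(sums[k] - S[k]) for k in range(len(S)))  # sum_k |eps_k|
    if err < best_err:
        best_err, best_assign = err, assign

# Recover the parts V~_k from the best assignment.
parts = [[v for v, k in zip(V, best_assign) if k == j] for j in range(len(S))]
print(parts, best_err)
```

Here $\{1,3\}$ and $\{2,4\}$ hit both targets exactly, so the minimum total error is $0$.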
In my application, it is known that $\sum_{v \in V}{v} = \sum_{s \in S}{s}$, if that simplifies the problem.
I have looked through many combinatorial optimization problems, but I can't find one that matches mine (or at least I can't tell which one is equivalent).
I am looking for a name for this problem, pointers for further reading, or some insight into how to approach it.
You can solve this problem via mixed integer linear programming as follows. Let binary decision variable $x_{i,k}$ indicate whether value $v_i$ is assigned to part $k$, and let $\epsilon^+_k \ge 0$ and $\epsilon^-_k \ge 0$ be slack variables capturing the positive and negative parts of the error $\epsilon_k$. The problem is to minimize $\sum_{k=1}^m (\epsilon^+_k + \epsilon^-_k)$ subject to the linear constraints: \begin{align} \sum_{k=1}^m x_{i,k} &= 1 &&\text{for $i\in \{1,\dots,n\}$}\\ \sum_{i=1}^n v_i x_{i,k} - \epsilon^+_k + \epsilon^-_k &= s_k &&\text{for $k\in \{1,\dots,m\}$} \end{align} The first family of constraints assigns each value to exactly one part; the second forces the slacks to absorb the deviation of each part's sum from $s_k$. To recover the partition from an optimal solution $x^*$, take $$\tilde{V}_k=\{v_i : x^*_{i,k}=1\}.$$
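As a minimal sketch of this formulation, here it is set up with `scipy.optimize.milp` (SciPy ≥ 1.9); any MILP solver would do, and the instance data is made up for illustration:

```python
# MILP sketch of the formulation above using SciPy's milp.
# Variable layout: x_{i,k} flattened as i*m + k, then eps+_k, then eps-_k.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

v = np.array([3, 5, 2, 7, 4])   # values V (illustrative data)
s = np.array([9, 12])           # target sums S; note sum(v) == sum(s)
n, m = len(v), len(s)
nvar = n * m + 2 * m

# Objective: minimize sum_k (eps+_k + eps-_k); x variables have zero cost.
c = np.concatenate([np.zeros(n * m), np.ones(2 * m)])

A = np.zeros((n + m, nvar))
rhs = np.concatenate([np.ones(n), s.astype(float)])
for i in range(n):              # each value assigned to exactly one part
    A[i, i * m:(i + 1) * m] = 1.0
for k in range(m):              # sum_i v_i x_{i,k} - eps+_k + eps-_k = s_k
    A[n + k, k:n * m:m] = v
    A[n + k, n * m + k] = -1.0
    A[n + k, n * m + m + k] = 1.0

res = milp(
    c,
    constraints=LinearConstraint(A, rhs, rhs),  # equality constraints
    integrality=np.concatenate([np.ones(n * m), np.zeros(2 * m)]),
    bounds=Bounds(np.zeros(nvar),
                  np.concatenate([np.ones(n * m), np.full(2 * m, np.inf)])),
)

# Recover the partition V~_k = {v_i : x*_{i,k} = 1} (as index sets here).
x = res.x[:n * m].round().reshape(n, m)
partition = [[i for i in range(n) if x[i, k] == 1] for k in range(m)]
print(partition, res.fun)
```

For this instance a zero-error partition exists (e.g. $\{3,2,4\}$ and $\{5,7\}$), so the optimal objective value is $0$.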