Consider a set $S = \{s_1, \dots, s_N\}$ of positive integers drawn from some interval $[M, L]$, with the requirement that the sums of any two distinct $k$-element subsets of $S$ differ by at least $d$ (in particular, all such sums are distinct). For example, if $k = 2$, the sum of any two elements $s_a$ and $s_b$ must be unique and differ by at least $d$ from every other sum of two elements of $S$.
As a function of $k$, the minimum gap $d$ between sums, and the length of the interval, $L - M$, what is the largest possible size of the set $S$, and what is the most efficient way to compute its elements?
Edit - I'm of course happy to hear any $d = 1$ solutions.
Edit 2 - For the $d=1$ case, is this problem NP-complete, similar to the subset-sum problem?
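To make the constraint concrete, here is a small brute-force check of the property (a sketch in Python; the sets, $k$, and $d$ used below are just illustrative):

```python
from itertools import combinations

def is_valid(S, k, d):
    """Check that all k-subset sums of S are pairwise at least d apart."""
    sums = sorted(sum(c) for c in combinations(S, k))
    return all(b - a >= d for a, b in zip(sums, sums[1:]))

# {1, 2, 5, 11} works for k = 2, d = 1: its pairwise sums
# 3, 6, 7, 12, 13, 16 are all distinct.
print(is_valid([1, 2, 5, 11], 2, 1))  # True
print(is_valid([1, 2, 3, 4], 2, 1))   # False: 1 + 4 = 2 + 3
```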
Here's a start. There are $\binom{N}{k}$ ways to choose $k$ elements (a few more, if repeats are allowed), and each sum lies between $kM$ and $kL$, so for the sums to be distinct you must have $${N\choose k}\lt kL-kM+2,$$ which gives an upper bound on $N$. It doesn't tell you whether that bound is achievable, nor give an efficient way to try.
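The counting bound above is easy to evaluate numerically. A sketch (Python; the function name is mine, and the `d` parameter uses the spacing argument that the sums must be pairwise $\ge d$ apart inside an interval of length $k(L-M)$, so $d=1$ recovers the displayed inequality):

```python
from math import comb

def max_N_upper_bound(k, M, L, d=1):
    """Largest N allowed by the counting bound: the C(N, k) subset sums
    lie in [kM, kL] and must be pairwise >= d apart, so
    C(N, k) <= k*(L - M) // d + 1."""
    limit = k * (L - M) // d + 1
    N = k
    while comb(N + 1, k) <= limit:
        N += 1
    return N

# For pairs (k = 2) drawn from [1, 16] with distinct sums:
print(max_N_upper_bound(2, 1, 16))  # 8
```

Note this is only necessary, not sufficient: for $k = 2$, $d = 1$ these are Sidon sets, and the true maximum in $[1, 16]$ is well below 8.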
EDIT: Ah, you've changed the problem, so my answer as written covers only your case $d=1$. For general $d$, the $\binom{N}{k}$ sums must be pairwise at least $d$ apart inside an interval of length $kL-kM$, which gives $$\binom{N}{k}\le\frac{kL-kM}{d}+1,$$ and this applies to your more general question.
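As for computing elements: for $k=2$, $d=1$ the greedy construction is the classical Mian–Chowla sequence (take each candidate in turn, keep it if it preserves the property). A sketch of the obvious generalization (Python; this is a heuristic and is not claimed to reach the counting bound):

```python
from itertools import combinations

def greedy_set(k, M, L, d):
    """Greedily build S inside [M, L] whose k-subset sums are pairwise
    at least d apart. Generalizes the Mian-Chowla greedy (k = 2, d = 1)."""
    S, sums = [], []
    for s in range(M, L + 1):
        if len(S) < k - 1:          # too few elements yet to form a k-subset
            S.append(s)
            continue
        # The k-subset sums that adding s would introduce.
        new_sums = sorted(s + sum(c) for c in combinations(S, k - 1))
        if (all(abs(a - b) >= d for a in new_sums for b in sums)
                and all(y - x >= d for x, y in zip(new_sums, new_sums[1:]))):
            S.append(s)
            sums.extend(new_sums)
    return S

# k = 2, d = 1 on [1, 13] reproduces the start of Mian-Chowla:
print(greedy_set(2, 1, 13, 1))  # [1, 2, 3, 5, 8, 13]
```

Each candidate costs $O\!\big(\binom{|S|}{k-1}\cdot|\text{sums}|\big)$ checks here; keeping `sums` sorted and binary-searching would speed this up, but the greedy choice itself is what keeps the result away from the upper bound.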