Let $J = \{j_n\}_{n=1}^N$ be a finite set of non-negative real numbers, and $ \Delta j \geq 0$. I want to find the smallest subset $J_s =\{j_i\}_{i=1}^M \subset J$ such that for every $j_k \in J$ there exists a $ j_l \in J_s $ such that $ |j_k - j_l| \leq \Delta j$.
Example: Let $J = \{1,2,3,5,7,8,12\}$ and take $\Delta j = 1.2$. Then $J_s = \{2,5,7,12\}$ solves the problem. (So does $J_s = \{2,5,8,12\}$.)
Does anyone here know how to solve this, or any equivalent problems? All help is much appreciated.
First, sort all the elements of $J$; this can be done in $\mathcal O(N\log N)$ time. Reindex the elements of $J$ so that $j_1 \leq j_2 \leq \dots \leq j_N$.
Next, for each $j \in J$ define $S_j = \{ j' \in J : |j-j'| \leq \Delta j\}$; think of $S_j$ as the set of elements that $j$ covers. We then build $J_s$ greedily: find the rightmost $j$ such that $S_j$ covers the leftmost element of $J$, and include this $j$ in $J_s$. Then take the first element of $J$ not covered by $S_j$ (call it $j'$) and repeat the process with $j'$ playing the role of the leftmost element. Keep repeating until all the elements of $J$ are covered.
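A minimal sketch of this greedy procedure in Python (the function name `smallest_cover` is my own choice, not from any library):

```python
def smallest_cover(J, dj):
    """Greedy cover: return a smallest subset J_s of J such that every
    element of J is within dj of some element of J_s."""
    J = sorted(J)
    J_s = []
    n = len(J)
    i = 0  # index of the leftmost element not yet covered
    while i < n:
        leftmost = J[i]
        # Advance to the rightmost element whose distance to `leftmost`
        # is at most dj; this is the greedy choice of center.
        k = i
        while k + 1 < n and J[k + 1] - leftmost <= dj:
            k += 1
        center = J[k]
        J_s.append(center)
        # Skip past everything the chosen center covers.
        i = k + 1
        while i < n and J[i] - center <= dj:
            i += 1
    return J_s
```

On the example above, `smallest_cover([1, 2, 3, 5, 7, 8, 12], 1.2)` returns `[2, 5, 8, 12]`, one of the two optimal solutions. After sorting, the scan is a single left-to-right pass, so the whole algorithm runs in $\mathcal O(N\log N)$ time.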
This webpage has a more thorough pseudocode if you're interested: http://www.cs.yorku.ca/~andy/courses/3101/lecture-notes/IntervalCover.html