I recently came across this argument and I'm not completely sure I understand why it's true.
Say we have two numbers $F_i, F_j \in [0,1]$ such that $F_j \geq F_i + 2^{-l_i}$, where $l_i = \left\lceil \log_2{\frac{1}{p_i}}\right\rceil$ for some $p_i \in (0,1]$. The argument is that the binary expansion of $F_j$ must differ from the binary expansion of $F_i$ in at least one of the first $l_i$ places.
This holds for the examples I've tried, but I don't know how one would go about proving it.
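For what it's worth, here is a small sketch of the kind of numerical check I've been doing (the helper `first_bits` is just my own name for truncating a binary expansion; the random sampling is only a spot check, not a proof):

```python
import random
from math import ceil, log2

def first_bits(x, n):
    """Return the first n bits of the binary expansion of x in [0, 1)."""
    bits = []
    for _ in range(n):
        x *= 2          # shift the next binary digit into the integer part
        b = int(x)
        bits.append(b)
        x -= b          # keep only the fractional part
    return bits

random.seed(0)
for _ in range(100_000):
    p_i = random.uniform(0.01, 1.0)
    l_i = ceil(log2(1.0 / p_i))
    F_i, F_j = sorted(random.random() for _ in range(2))
    # Only test pairs satisfying the hypothesis F_j >= F_i + 2^{-l_i}
    if l_i >= 1 and F_j >= F_i + 2.0 ** (-l_i):
        assert first_bits(F_i, l_i) != first_bits(F_j, l_i)
print("no counterexample found")
```

Every sampled pair satisfying the hypothesis differs somewhere in the first $l_i$ bits, which is consistent with the claim, though of course random sampling can't rule out a counterexample.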