In *Elementary Methods in Analytic Number Theory* by Gelfond and Linnik, the claim is made that if $d(A) + d(B) > 1$, then we can find $A',B'$ where $A' \subseteq A$ and $B' \subseteq B$ such that $d(A') + d(B')$ is as close to $1$ as we want.
It is not clear to me why this is true.
Here are the assumptions:
$A,B$ are infinite sequences of integers that start with $0$ and are listed in increasing order, $0, a_1, a_2, \cdots$ where $0 < a_1 < a_2 < \cdots$
Schnirelmann density is defined as: $$d(A) = \inf\limits_{n}\frac{A(n)}{n}$$
where: $$A(n) = \sum\limits_{0<a_i\le{n}}{1}$$
So, it is clear that: $$0 \le \frac{A(n)}{n} \le 1$$
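As a sanity check, the definition can be computed directly on a finite truncation of a sequence; here is a minimal sketch (the cutoff `N` and the example sequence are my own choices, not from the book):

```python
from fractions import Fraction

def count(A, n):
    """A(n): the number of elements a of A with 0 < a <= n."""
    return sum(1 for a in A if 0 < a <= n)

def density(A, N):
    """Schnirelmann density of A, with the infimum taken over
    1 <= n <= N only (a min, which upper-bounds the true inf)."""
    return min(Fraction(count(A, n), n) for n in range(1, N + 1))

# The odd numbers (together with 0) have density 1/2: A(n)/n is
# smallest at even n, where exactly half of 1..n belongs to A.
odds = [0] + [2 * k + 1 for k in range(50)]
print(density(odds, 50))   # 1/2
```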
I would appreciate it if someone could explain why $d(A') + d(B')$ can be made as close to $1$ as we wish.
I think that I've figured out the reasoning here.
Here's my thinking:
(1) We can assume that $d(B) \le d(A) < 1$.
Note: If $d(A)=1$, set $B' = B - \{1\}$; then $B'(1) = 0$, so $d(B') = 0$ and $d(A)+d(B')=1$
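The trick in this note is that any $B'$ not containing $1$ has $B'(1)/1 = 0$, which forces $d(B') = 0$. A quick numeric check (the truncation bound $29$ is arbitrary):

```python
from fractions import Fraction

def density(B, N):
    """Schnirelmann density with the inf taken over 1 <= n <= N."""
    count = lambda n: sum(1 for b in B if 0 < b <= n)
    return min(Fraction(count(n), n) for n in range(1, N + 1))

B = list(range(30))                   # 0, 1, ..., 29: density 1
B_prime = [b for b in B if b != 1]    # delete the element 1
print(density(B, 29))                 # 1
print(density(B_prime, 29))           # 0, since B'(1)/1 = 0
```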
(2) Let $\epsilon$ be any number such that $1 > \epsilon > 0$
(3) We can assume that $1 - d(A) > \epsilon$
Note: If not, then $1 - d(A) \le \epsilon$; setting $B' = B - \{1\}$ gives $d(B') = 0$, so $1 - d(A) - d(B') = 1 - d(A) \le \epsilon$
(4) There exist integers $x,y$ with $y > 0$ such that $1 - d(A) - \epsilon \le \frac{x}{y} < 1 - d(A)$, since fractions with a fixed denominator $y \ge 1/\epsilon$ are spaced at most $\epsilon$ apart
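One explicit way to produce such a pair: take $y \ge 1/\epsilon$ and let $x$ be the largest integer with $\frac{x}{y} < 1 - d(A)$. A sketch, where the sample values of $d(A)$ and $\epsilon$ are hypothetical:

```python
from fractions import Fraction
import math

def rational_just_below(t, eps):
    """Return (x, y) with t - eps <= x/y < t, assuming 0 < eps < t.
    Choosing y >= 1/eps makes the step size 1/y at most eps."""
    y = math.ceil(1 / eps)
    x = math.ceil(y * t) - 1   # largest integer with x/y < t
    return x, y

t = 1 - Fraction(3, 5)         # 1 - d(A) for a hypothetical d(A) = 3/5
eps = Fraction(1, 10)
x, y = rational_just_below(t, eps)
print(x, y)                    # 3 10, and 3/10 lies in [3/10, 2/5)
```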
(5) From the definition of density, we know that $d(B) \le \frac{B(y)}{y}$
(6) Using $y$, we can build $B'$ by removing $B(y)-x$ of the elements of $B$ that lie in $[1, y]$, so that $B'(y) = x$ and hence $d(B') \le \frac{B'(y)}{y} = \frac{x}{y}$.
Note: $\frac{x}{y} < d(B) \le \frac{B(y)}{y}$, since $d(A) + d(B) > 1$ while $d(A) + \frac{x}{y} < 1$; in particular $B(y) - x > 0$, so there are elements available to remove.
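Putting steps (4)-(6) together on a concrete example (all the specific numbers are hypothetical: suppose $d(A) = 3/5$ and $\epsilon = 1/10$, so step (4) can give $x = 3$, $y = 10$; for $B$ I take all nonnegative integers, so $d(B) = 1$ and $d(A) + d(B) > 1$):

```python
from fractions import Fraction

def density(B, N):
    """Schnirelmann density with the inf taken over 1 <= n <= N."""
    count = lambda n: sum(1 for b in B if 0 < b <= n)
    return min(Fraction(count(n), n) for n in range(1, N + 1))

x, y = 3, 10                   # from step (4), for d(A) = 3/5, eps = 1/10
B = list(range(200))           # 0, 1, 2, ...: d(B) = 1
# Step (6): delete B(y) - x elements of B lying in [1, y] -- here the
# largest ones -- so that B'(y) = x.
at_most_y = [b for b in B if 0 < b <= y]
removed = set(at_most_y[x:])   # keep the x smallest, drop the rest
B_prime = [b for b in B if b not in removed]

d_B_prime = density(B_prime, 199)
print(d_B_prime)                      # 3/10, i.e. exactly x/y here
print(Fraction(3, 5) + d_B_prime)     # 9/10, within eps of 1
```

In this example the minimum of $B'(n)/n$ is attained at $n = y$, so $d(B')$ comes out to exactly $\frac{x}{y}$, and $d(A) + d(B') = \frac{9}{10}$ is within $\epsilon$ of $1$.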
Please let me know if you can state the argument more concisely or if you see a problem in my reasoning.