Let $A$ be a non-empty subset of $\mathbb{R}$. Define the difference set to be
$A_d := \{\,b-a \;|\; a,b \in A \text{ and } a < b \,\}.$
If $A$ is infinite and bounded, then $\inf(A_d) = 0$.
Since $a < b$, we have $b - a > 0$. Thus zero is a lower bound for $A_d$, so $\inf(A_d) \geq 0$. I then want to show the contrapositive: if $\inf(A_d) = \epsilon > 0$ and $A$ is bounded, then $A$ is finite.
Let $\inf(A) = \beta$ and $\sup(A) = \alpha$. I claim there can be at most $\lfloor \frac{\alpha - \beta}{\epsilon} \rfloor + 1$ elements in $A$. Suppose instead that $A$ contains more than $\lfloor \frac{\alpha - \beta}{\epsilon} \rfloor + 1$ elements, say $a_1 < a_2 < \dots < a_n$ with $n \geq \lfloor \frac{\alpha - \beta}{\epsilon} \rfloor + 2$. Since $b - a \geq \epsilon$ for all $a, b \in A$ with $a < b$, consecutive elements are at least $\epsilon$ apart, so $a_n - a_1 \geq (n-1)\epsilon \geq (\lfloor \frac{\alpha - \beta}{\epsilon} \rfloor + 1)\epsilon$. However, this is a contradiction, since $(\lfloor \frac{\alpha - \beta}{\epsilon} \rfloor + 1)\epsilon > \frac{\alpha - \beta}{\epsilon} \cdot \epsilon = \alpha - \beta \geq a_n - a_1$. Thus the cardinality of $A$ is at most $\lfloor \frac{\alpha - \beta}{\epsilon} \rfloor + 1$, hence finite. We have shown that if $\inf(A_d) > 0$ and $A$ is bounded, then $A$ cannot be infinite.
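As a sanity check on the counting bound (a sketch of my own, not part of the proof; the function names `max_points` and `is_eps_separated` are made up), here is a short Python snippet in exact rational arithmetic. It computes the bound $\lfloor \frac{\alpha - \beta}{\epsilon} \rfloor + 1$ and verifies that a greedy $\epsilon$-separated packing of $[\beta, \alpha]$ attains it:

```python
from fractions import Fraction
import math

def max_points(beta, alpha, eps):
    # Bound from the argument above: any subset of [beta, alpha] whose
    # pairwise differences are all >= eps has at most
    # floor((alpha - beta) / eps) + 1 elements.
    return math.floor((alpha - beta) / eps) + 1

def is_eps_separated(points, eps):
    # True iff every pair of distinct points differs by at least eps.
    pts = sorted(points)
    return all(b - a >= eps for a, b in zip(pts, pts[1:]))

beta, alpha, eps = Fraction(0), Fraction(1), Fraction(1, 4)
bound = max_points(beta, alpha, eps)             # floor(4) + 1 = 5
greedy = [beta + k * eps for k in range(bound)]  # 0, 1/4, 1/2, 3/4, 1
assert greedy[-1] <= alpha and is_eps_separated(greedy, eps)
assert len(greedy) == bound  # the bound is attained, so it is sharp
```

The greedy packing shows the bound cannot be improved in general: with gaps of exactly $\epsilon$, a set of exactly $\lfloor \frac{\alpha - \beta}{\epsilon} \rfloor + 1$ points fits in $[\beta, \alpha]$.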
One question I have is whether this is enough to prove the theorem. I'm sure there are more efficient ways to formulate the above argument. This feels like a good opportunity for the pigeonhole principle, but I don't really know how to "invoke" it. Critique is welcomed and appreciated.
Your argument looks fine to me. If you want to apply the Pigeonhole Principle: we have $A \subset [\inf A, \sup A] = [x, y]$ with $x < y$. For any $r > 0$, take $n \in \mathbb{N}$ such that $(y-x)/n < r$. The set of $n$ intervals $S = \{[x + j(y-x)/n,\; x + (j+1)(y-x)/n] : 0 \leq j < n\}$ covers $[x, y]$. Take any set $B$ of $n+1$ members of $A$ (possible because $A$ is infinite). At least two distinct $c, d \in B$ belong to the same member of $S$. So $\exists\, c, d \in A\;(0 < |c-d| \leq (y-x)/n < r)$. Since $r > 0$ was arbitrary, $\inf(A_d) = 0$.
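The pigeonhole step can be illustrated numerically (again a sketch of my own; `two_close_points` is a made-up name): partition $[x, y]$ into $n$ equal cells, drop in $n+1$ points, and report the first pair that lands in the same cell.

```python
from fractions import Fraction

def two_close_points(points, x, y, n):
    # Pigeonhole: given n + 1 points of [x, y] and n equal subintervals
    # of width (y - x)/n, some subinterval receives two points, and those
    # two points differ by at most (y - x)/n.
    width = Fraction(y - x, n)
    seen = {}  # subinterval index -> first point that landed there
    for p in points:
        j = min(int((p - x) / width), n - 1)  # cell containing p
        if j in seen:
            return seen[j], p  # two points sharing a cell
        seen[j] = p
    return None  # unreachable when len(points) > n

# n + 1 = 4 distinct points in [0, 1], n = 3 cells of width 1/3
pts = [Fraction(0), Fraction(2, 5), Fraction(1, 2), Fraction(9, 10)]
c, d = two_close_points(pts, 0, 1, 3)
assert 0 < abs(c - d) <= Fraction(1, 3)
```

Exact rationals are used so the cell index is computed without floating-point rounding at cell boundaries.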