Suppose $c < d$. Prove there is a $q \in \mathbb{Q}$ so that $|q-\sqrt{2}| < d - c$.
I began this proof by setting $c=1, d=2$, and consequently, $q=\sqrt{2}$. However, my professor informed me that proofs need to satisfy general cases, which makes sense.
Then I set up the inequality $-\alpha<q-\sqrt{2}<\alpha$ with $\alpha > 0$ arbitrary. Because $\sqrt{2}$ is irrational and $q$ is rational, $i=q-\sqrt{2}$ must be irrational as well, so we have $-\alpha<i<\alpha$. This setup seems typical of similar problems, but I do not know how to complete the proof. I appreciate any and all help!
Hint: Consider the number $x(n)$ represented by the first $n$ digits in the decimal expansion of $\sqrt{2}$. For example, $x(1) = 1$, $x(2) = 1.4$, $x(3) = 1.41$ and so on. Each $x(n)$ is rational since its decimal expansion is finite, and we can see that as $n$ becomes large, $x(n)$ becomes a better and better approximation of $\sqrt{2}$.
To analyze how good these approximations are, note the following inequalities: $$\begin{align*}\sqrt{2} - x(1) = 0.4142\dots & < 1, \\ \sqrt{2} - x(2) = 0.0142\dots & < 1/10, \\ \sqrt{2} - x(3) = 0.0042\dots & < 1/10^2, \end{align*}$$ and so on. Try using this pattern (and be sure to prove it really is a pattern) to show that with sufficiently large $n$, the approximation gets as good as you like (without being equal to $\sqrt{2}$).
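For instance, here is a sketch of how the argument could conclude once the pattern is proved (the bound $10^{-(n-1)}$ below is the general form suggested by the inequalities above, not something already established):

```latex
% Sketch: finishing the proof, assuming the pattern
%   0 < \sqrt{2} - x(n) < 10^{-(n-1)}
% has been verified for every n.
Since $c < d$, we have $d - c > 0$, so by the Archimedean property
there is an $n$ with
\[
  10^{-(n-1)} < d - c .
\]
Taking $q = x(n) \in \mathbb{Q}$ then gives
\[
  |q - \sqrt{2}| \;=\; \sqrt{2} - x(n) \;<\; 10^{-(n-1)} \;<\; d - c ,
\]
which is exactly what was required.
```

Note that the strict inequality $0 < \sqrt{2} - x(n)$ is automatic: no finite decimal equals $\sqrt{2}$, since $\sqrt{2}$ is irrational.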