If $\{a_n\}$ converges to $a$ and $\{b_n\}$ converges to $b$, and we have that $a_n < b_n$ for all $n$, is it true that $a<b$?
I was thinking of some counterexample where the values jump back and forth before converging, but eventually they need to "cluster" around two different points, and if they were not ordered very far into the sequence, we would have a problem; so I think the claim is true. For a proof I would go back to the definition:
$ \forall \epsilon_0>0$ we know $\exists k_0 \in \mathbb{N}$ such that for all $k\geq k_0$, we have that $|a_k -a|< \epsilon_0$, and similarly $ \forall \epsilon_1>0$ we know $\exists k_1 \in \mathbb{N}$ such that for all $k\geq k_1$, we have that $|b_k -b|< \epsilon_1$.
For all $\epsilon>0$, I suppose we want to pick a specific $\epsilon_0=\epsilon_1= \frac{\epsilon}{2}$. But now I am stuck: I don't want to prove that the limits are equal, I want to prove that they have a certain ordering.
No, $a_n<b_n$ for all $n$ does not imply that $a<b$; it only gives the weak inequality $a\le b$. Take for example $a_n=-\frac{1}{n}\to a=0$ and $b_n=\frac{1}{n}\to b=0$. Another example: $a_n=\frac{1}{2n+1}\to a=0$ and $b_n=\frac{1}{2n}\to b=0$. In both cases $a_n<b_n$ holds for every $n$, yet the limits are equal.
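The first counterexample can be checked numerically. This is only a sanity-check sketch, not a proof; the helper names `a`, `b`, and the cutoff `N` are my own choices, using exact rational arithmetic to avoid floating-point noise.

```python
from fractions import Fraction

# Counterexample: a_n = -1/n and b_n = 1/n satisfy a_n < b_n for
# every n, yet both sequences converge to the same limit 0.
def a(n):
    return Fraction(-1, n)

def b(n):
    return Fraction(1, n)

# The strict ordering holds for every term we test...
assert all(a(n) < b(n) for n in range(1, 10_001))

# ...while, for eps = 1/1000, both terms are within eps of 0
# once n >= N = 1001 (since 1/n <= 1/1001 < 1/1000 there):
eps = Fraction(1, 1000)
N = 1001
assert all(abs(a(n)) < eps and abs(b(n)) < eps for n in range(N, N + 100))
```

Of course, no finite check can replace the $\epsilon$-$N$ argument; the point is just that the strict term-by-term inequality never forces the limits apart.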