In a proof, I encountered the following situation:
For every $n \in \mathbb{N}$ with $k^2 \leq n \leq (k+1)^2$, we have:
$$a_k \leq b_n \le c_{k+1}$$
where $a_k, c_k \to 0$ as $k \to \infty$.
They then conclude that $\lim_{n \to \infty} b_n = 0$. I can see this intuitively but can't write it out rigorously.
Define $A_n$ and $C_n$ as follows: if $k^2\leq n<(k+1)^2$, let $A_n=a_k$ and $C_n=c_{k+1}$. (Every $n\geq 1$ lies in exactly one such block, so this is well defined.) Then $A_n\leq b_n\leq C_n$ for all $n$, so by the Sandwich Theorem it suffices to show $A_n\to 0$ and $C_n\to 0$.
For example, with $A_n$: let $\varepsilon>0$ be given. As $a_k\to 0$, there is $K$ such that $|a_k|<\varepsilon$ for all $k\geq K$. Now if $n\geq K^2$, the unique $k$ with $k^2\leq n<(k+1)^2$ satisfies $k\geq K$ (otherwise $k+1\leq K$ would give $n<(k+1)^2\leq K^2$), so $A_n=a_k$ for some $k\geq K$ and hence $|A_n|<\varepsilon$. The argument for $C_n$ is identical, with $c_{k+1}$ in place of $a_k$.
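For a concrete sanity check, here is a small numerical sketch of the construction, using the hypothetical choices $a_k=-1/k$ and $c_k=1/k$ (both tend to $0$) and taking $b_n$ to be the midpoint of the sandwich; the names `A`, `C`, `b` are just illustration:

```python
import math

def A(n):
    # A_n = a_k for the unique k with k^2 <= n < (k+1)^2;
    # math.isqrt gives exactly that k (largest k with k^2 <= n).
    k = math.isqrt(n)
    return -1.0 / k          # hypothetical a_k = -1/k

def C(n):
    # C_n = c_{k+1} for the same k
    k = math.isqrt(n)
    return 1.0 / (k + 1)     # hypothetical c_k = 1/k

def b(n):
    # any sequence trapped in the sandwich will do; take the midpoint
    return (A(n) + C(n)) / 2

# The blockwise bounds A_n <= b_n <= C_n hold, and both bounds shrink
# like 1/sqrt(n), forcing b_n -> 0.
for n in (1, 10, 100, 10_000, 1_000_000):
    assert A(n) <= b(n) <= C(n)
```

Note that for $n\geq K^2$ we have $k=\lfloor\sqrt{n}\rfloor\geq K$, which is exactly the index shift used in the proof above.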