I wanted to prove the seemingly simple statement:
If $\lim \frac {a_n}{b_n}=1$ and $f$ is continuous with $f(b_n)\neq0$ for all $n$, then $\lim \frac {f(a_n)}{f(b_n)}=1.$
I started promptly with
\begin{align} \lim \frac {f(a_n)}{f(b_n)} &= \lim \frac { f(b_n \times \frac{a_n}{b_n})}{f(b_n)} \\ &= \frac { f(b_n \times \lim\frac{a_n}{b_n})}{f(b_n)} \\ &= \frac { f(b_n \times 1)}{f(b_n)} \\ &= 1 \end{align}
Yet two seconds later I realized what nonsense this was and that I had fallen victim to one of the freshman's dreams.
I would greatly appreciate a hint for a proof or a counterexample if the statement turns out to be false.
This is not true. Let $a_n = n + \log n$, $b_n = n$, $f(x) = e^x$. Then $\lim \frac{a_n}{b_n} = 1$, but $\frac{f(a_n)}{f(b_n)} = e^{\log n} = n$ does not converge to 1. This is an issue with uniform continuity, I think.
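A quick numerical sanity check of this counterexample (just a sketch using the standard `math` module; the bound $n \le 600$ keeps $e^{a_n}$ within double-precision range) shows $\frac{a_n}{b_n}$ approaching $1$ while $\frac{f(a_n)}{f(b_n)}$ grows without bound:

```python
import math

def a(n):
    return n + math.log(n)  # a_n = n + log n

def b(n):
    return n                # b_n = n

for n in [10, 100, 600]:
    seq_ratio = a(n) / b(n)                         # tends to 1
    f_ratio = math.exp(a(n)) / math.exp(b(n))       # e^{log n} = n, diverges
    print(n, seq_ratio, f_ratio)
```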
Edit: Scratch that, uniform continuity is not sufficient either. But if $f$ is continuous, $a_n, b_n$ are bounded, and $f(b_n)$ is bounded away from zero, the statement may hold — though that's a lot of conditions.
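Here is a sketch of why those conditions suffice (assuming additionally that $b_n$ is bounded, say $|a_n|, |b_n| \le M$, and $|f(b_n)| \ge c > 0$ — the boundedness of $a_n$ actually follows, since $a_n = b_n \cdot \frac{a_n}{b_n}$):

```latex
% Step 1: bounded times null sequence.
\[
a_n - b_n = b_n\left(\frac{a_n}{b_n} - 1\right) \to 0.
\]
% Step 2: f is uniformly continuous on the compact interval [-M, M]
% containing all a_n, b_n, hence f(a_n) - f(b_n) -> 0.
% Step 3: divide by the denominator bounded away from zero.
\[
\left|\frac{f(a_n)}{f(b_n)} - 1\right|
  = \frac{\bigl|f(a_n) - f(b_n)\bigr|}{|f(b_n)|}
  \le \frac{\bigl|f(a_n) - f(b_n)\bigr|}{c} \to 0.
\]
```

This also explains the counterexample above: with $a_n = n + \log n$ and $b_n = n$, the difference $a_n - b_n = \log n$ does not tend to $0$, which is what boundedness of $b_n$ rules out.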