I have the following problem:
Suppose that $\lim_{n \to \infty} a_n = 0$. Prove that for any $x$ $$\lim_{n \to \infty} \left(1+a_n \frac{x}{n}\right)^n = 1.$$
I have tried replacing $a_n$ with a function $a(n)$ and applying L'Hôpital's rule, but got no useful result. I also went directly to the definitions and showed that $\lim_{n \to \infty} \left(1+a_n \frac{x}{n}\right) = 1$, but I don't know how to deal with the $n$th power: writing the expression as $\exp\left(n\ln\left(1+a_n \frac{x}{n}\right)\right)$ leaves an indeterminate form $\infty \cdot 0$ in the exponent.
Fix $x\in\mathbf R$.
By Bernoulli's inequality (here $n$ has to be large enough, so that $a_n x/n>-1$), $$ \Bigl(1+\frac{a_nx}{n}\Bigr)^n\geq 1+a_n x. $$ On the other hand, since $\ln(1+t)\leq t$ for $t>-1$ gives $$ \Bigl(1+\frac{a}{n}\Bigr)^n\leq e^a $$ for all $a>-n$, we also have $$ \Bigl(1+\frac{a_nx}{n}\Bigr)^n\leq e^{a_n x} $$ for $n$ large enough. Now use the sandwich theorem.
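For completeness, the final squeeze step can be spelled out (using only $a_n \to 0$ and the continuity of $t \mapsto e^t$, nothing beyond what is in the argument above):

```latex
% Both bounds converge to 1 as n -> infinity:
\[
\lim_{n\to\infty}\,(1+a_n x) = 1 + 0\cdot x = 1,
\qquad
\lim_{n\to\infty} e^{a_n x} = e^{0\cdot x} = 1,
\]
% and for all sufficiently large n,
\[
1 + a_n x \;\leq\; \Bigl(1+\frac{a_n x}{n}\Bigr)^n \;\leq\; e^{a_n x},
\]
% so the sandwich (squeeze) theorem forces
\[
\lim_{n\to\infty}\Bigl(1+\frac{a_n x}{n}\Bigr)^n = 1.
\]
```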