Consider $\lim_{n \to \infty}\frac{a}{\,b/n\,}$, where $a$ and $b$ are constants.
I can think of it in two different ways. The first is that as $n \to \infty$, $\frac{b}{n} \to 0$, so we end up with $\lim_{n \to \infty}\frac{a}{\frac{b}{n}} = \frac{a}{0}$, which is undefined.
The other way is to say that $$\lim_{n \to \infty}\frac{a}{\frac{b}{n}} = \lim_{n \to \infty} a \cdot\frac{n}{b} = a \cdot \infty = \infty\ .$$
Which one is correct and why?
However it is written, at the outset we are given the sequence $$x_n:={a\,n\over b}\qquad(n\geq1)\ ,$$ with the tacit assumption that $b\ne0$.

If $a=0$ then $x_n=0$ for all $n$, hence $\lim_{n\to\infty}x_n=0$. If $a\ne0$ then the $x_n$ converge to $+\infty$ if $ab>0$ and to $-\infty$ if $ab<0$; in particular, the sequence is divergent in ${\mathbb R}$. Nevertheless we are entitled to write $$\lim_{n\to\infty}x_n=\infty\quad(ab>0),\qquad \lim_{n\to\infty}x_n=-\infty\quad(ab<0)\ ,$$ meaning that we accept $\pm\infty$ as limiting values and have verified the corresponding convergence conditions.
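The case analysis above can be checked numerically. The following sketch (the function name `x` and the sample values of $a$, $b$ are my own choices, not from the answer) evaluates $x_n = a\,n/b$ along $n = 10, 10^2, \dots$ and shows the three behaviours: constantly $0$ when $a=0$, growth without bound when $ab>0$, and decrease without bound when $ab<0$.

```python
def x(a, b, n):
    """n-th term of the sequence x_n = a*n/b (requires b != 0)."""
    return a * n / b

# Sample parameter choices for the three cases (illustrative only):
#   a*b > 0  -> terms grow without bound
#   a*b < 0  -> terms decrease without bound
#   a   = 0  -> terms are constantly 0
for a, b in [(2, 3), (-2, 3), (0, 3)]:
    terms = [x(a, b, 10 ** k) for k in range(1, 6)]
    print(f"a={a}, b={b}: {terms}")
```

No finite limit exists in the first two cases; the printed terms merely exceed any fixed bound once $n$ is large enough, which is exactly the convergence condition for the limiting values $\pm\infty$.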