This appeared in Example (b), Section 8.8 of *An Introduction to Probability Theory and Its Applications*, Vol. 2, by Feller.
Let $F(x)$ be the cumulative distribution function of some random variable.
Assume $1-F(x)$ varies regularly with exponent $\rho<0$; that is, $1-F(x)=x^{\rho}L(x)$, where $L(x)>0$ is slowly varying, meaning that for every fixed $x>0$: $$\lim_{t\to\infty}\frac {L(tx)}{L(t)}=1.$$
Slowly varying functions have the property that $$x^{-\epsilon}\leq L(x)\leq x^\epsilon$$ for any fixed $\epsilon>0$ and all $x$ sufficiently large.
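As a quick numerical sanity check of the two displayed facts, here is a sketch using $L(x)=\log x$, a standard slowly varying function (my choice of example, not Feller's):

```python
import math

# Sanity checks for L(x) = log x, a standard slowly varying function.
L = math.log

# 1) Slow variation: L(t*x)/L(t) -> 1 as t -> infinity, for fixed x = 10.
ratios = [L(t * 10) / L(t) for t in (1e2, 1e6, 1e12, 1e24)]
print(ratios)  # decreases toward 1 (slowly, like 1 + log(10)/log(t))

# 2) The power bound x^{-eps} <= L(x) <= x^{eps} for large x.
# eps = 0.5 here; a smaller eps only pushes the threshold further out.
eps = 0.5
assert all(x ** -eps <= L(x) <= x ** eps for x in range(3, 10**6, 997))
```

The convergence in (1) is quite slow, which is typical: slowly varying factors change on a logarithmic scale.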
The author says it is possible to find an increasing sequence $\{a_n\}$ such that $n(1-F(a_n))$ converges to $1$ as $n\to\infty$. I am having trouble seeing why such a sequence exists.
I can see that if I let $a_n$ increase fast (say $a_n=n^{-2/\rho}$, so $a_n^{\rho}=n^{-2}$), then $n(1-F(a_n))=n^{-1}L(a_n)\to 0$, since $L$ grows slower than any power. If I let $a_n$ increase slowly (say $a_n=n^{-0.5/\rho}$, so $a_n^{\rho}=n^{-1/2}$), then $n(1-F(a_n))=n^{1/2}L(a_n)\to+\infty$. Does this mean it is possible to choose $a_n$ so that $n(1-F(a_n))$ converges to $1$ (or to any other positive constant)?
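When $F$ is continuous, this "between too fast and too slow" observation can be made concrete via the intermediate value theorem: for each $n$, the function $a\mapsto n(1-F(a))$ is continuous and decreasing, crosses $1$ somewhere between the two rates above, and the resulting $a_n$ is increasing because $1-F$ is decreasing while $1/n$ decreases. A numerical sketch, using my own choice of tail $1-F(x)=x^{-2}\log(e+x)$ (so $\rho=-2$ and $L(x)=\log(e+x)$ is slowly varying):

```python
import math

# Hypothetical concrete tail (my choice, not Feller's):
# 1 - F(x) = x^{-2} * log(e + x) for x >= 1, so rho = -2.
def tail(x):
    return x ** -2 * math.log(math.e + x)

def a_n(n):
    """Solve n * tail(a) = 1 by bisection on [1, n]: tail is continuous
    and strictly decreasing there, so the intermediate value theorem
    guarantees a unique crossing (n*tail(1) > 1 > n*tail(n))."""
    lo, hi = 1.0, float(n)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if n * tail(mid) > 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

ns = (10, 100, 1000, 10_000)
seq = [a_n(n) for n in ns]
print(seq)                                   # an increasing sequence
print([n * tail(a) for n, a in zip(ns, seq)])  # each product is ~= 1.0
```

With this choice $n(1-F(a_n))=1$ holds exactly (up to floating-point precision) for every $n$, not just in the limit; if $F$ has jumps one instead takes $a_n$ to be a generalized inverse, and regular variation is what keeps the product near $1$.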