As the title says. I think this should follow straightforwardly but I can't find a proof.
My random variable of interest $X$ takes values in the non-negative integers. The only other assumption on its distribution is that $E(X)<\infty$. I want to prove: $$\lim_{n\to\infty}n\Pr(X\ge n) = 0.$$ This fact is asserted, e.g., in DeGroot (2004), "Optimal Statistical Decisions", p. 295, but no proof is given there.
All I have right now is that without the factor $n$ the claim is easy to prove using Markov's inequality: $$\Pr(X\ge n) \le \frac{1}{n}E(X) \to 0.$$ I appreciate any help in figuring this out.
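As a quick numerical sanity check (not a proof), one can compare $n\Pr(X\ge n)$ for two illustrative power-law pmfs: $p(k)\propto k^{-3}$, which has finite mean, and $p(k)\propto k^{-2}$, which does not. The truncation point `N` and both distributions are my own choices for illustration:

```python
# Sanity check (not a proof): n * P(X >= n) for two integer-valued
# distributions, one with finite mean (p(k) ~ 1/k^3) and one with
# infinite mean (p(k) ~ 1/k^2). Both are illustrative choices.

N = 10**6  # truncation point for the sums (assumed large enough)

def tail_times_n(exponent, ns):
    """Return n * P(X >= n) for p(k) = k^(-exponent) / Z, k = 1..N."""
    weights = [k ** -exponent for k in range(1, N + 1)]
    Z = sum(weights)
    # tail[n] = P(X >= n), accumulated from the right
    tail = [0.0] * (N + 1)
    acc = 0.0
    for k in range(N, 0, -1):
        acc += weights[k - 1]
        tail[k] = acc / Z
    return [n * tail[n] for n in ns]

ns = [10, 100, 1000, 10000]
finite_mean = tail_times_n(3, ns)    # shrinks toward 0
infinite_mean = tail_times_n(2, ns)  # stays bounded away from 0

print(finite_mean)
print(infinite_mean)
```

With $E(X)<\infty$ the products shrink roughly like $1/n$, while in the infinite-mean case they hover near a positive constant, consistent with the claim being equivalent to integrability of the tail.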
This fact holds for any monotonically decreasing sequence $a_n\ge 0$ with $\sum_{i=1}^\infty a_i<\infty$. Recall the Cauchy condensation test: for a decreasing non-negative sequence, $\sum_{i=1}^\infty a_i$ converges iff $\sum_{i=1}^\infty 2^i a_{2^i}$ converges. In particular, the terms of the condensed series must vanish, so $2^n a_{2^n}\to 0$. Now let $k(n):=\lfloor\log_2 n\rfloor$, so that $2^{k(n)}\le n<2^{k(n)+1}$. By monotonicity $a_n\le a_{2^{k(n)}}$, hence $$0\le na_n\le 2^{k(n)+1}a_{2^{k(n)}}=2\cdot 2^{k(n)}a_{2^{k(n)}}\to 0,$$ which gives $na_n\to 0$.
Now apply this with $a_i=\Pr(X\ge i)$, using the fact that $\sum_{i=1}^\infty \Pr(X\geq i)=E(X)<\infty$ for non-negative integer-valued random variables.
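For completeness, the tail-sum identity used here follows by swapping the order of summation, which is justified because all terms are non-negative (Tonelli):

```latex
\sum_{i=1}^\infty \Pr(X\ge i)
  = \sum_{i=1}^\infty \sum_{k=i}^\infty \Pr(X=k)
  = \sum_{k=1}^\infty \sum_{i=1}^{k} \Pr(X=k)
  = \sum_{k=1}^\infty k\,\Pr(X=k)
  = E(X).
```

So the hypotheses of the sequence lemma are exactly the assumptions on $X$: the tail probabilities are decreasing, and their sum is the (finite) mean.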