Let $(X_i)_{i\in\mathbb{N}}$ be a sequence of i.i.d. real random variables (with finite variance if needed). For $k>0$ fixed, I'm interested in estimating the quantity $$ p_N(k):=\mathbb{P}(\exists\,i=1,\dots,N\text{ s.t. }|X_i-\mathbb{E}[X_i]|\geq k\,N) \;.$$
Using Chebyshev's inequality (Markov's inequality applied to $|X_1-\mathbb{E}[X_1]|^2$) together with a union bound, if $X_1$ has finite variance one gets $p_N(k)\leq \operatorname{Var}(X_1)/(k^2N)$, hence $\sup_{N\in\mathbb{N}} p_N(k)\to0$ as $k\to\infty$.
I would like to prove that, for $k$ large enough, $\sum_{N=1}^\infty p_N(k) <\infty$. Is it true?
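As a sanity check before the proof, one can estimate $p_N(k)$ by Monte Carlo for a concrete distribution. This is only an illustrative sketch; the choice of standard normal $X_i$ (so $\mathbb{E}[X_i]=0$), the value $k=1$, and the trial count are assumptions, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_p_N(N, k, trials=10_000):
    """Monte Carlo estimate of P(exists i <= N with |X_i - E[X_i]| >= k*N)
    for i.i.d. standard normal X_i (so E[X_i] = 0)."""
    X = rng.standard_normal((trials, N))
    # event occurs iff the max deviation over the N samples reaches k*N
    return np.mean(np.max(np.abs(X), axis=1) >= k * N)

# for fixed k, p_N(k) decays rapidly in N, consistent with summability
print([estimate_p_N(N, 1.0) for N in (1, 2, 3, 5)])
```

For the normal case the decay is in fact super-exponential in $N$, so the summability asked about is very visible numerically.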
If $X_1$ has finite variance, it is true. Write $t_N:=\mu\{|X_1-\mathbb E(X_1)|\geqslant kN\}$. By independence, $p_N(k)=1-(1-t_N)^N$, and the inequality $(1-t)^N\geqslant 1-tN$ for $0\leqslant t\leqslant 1$ gives $p_N(k)\leqslant N\,t_N$. We are thus reduced to proving that the series $\sum_j c_j$ converges, where $$c_j:= j\cdot\mu\{|X_1-\mathbb E(X_1)|\geqslant kj\}.$$ Define $A_j:=\{kj\leqslant |X_1-\mathbb E(X_1)|\lt k(j+1)\}$. Since $\sum_{n=1}^j n=\frac{j(j+1)}2\leqslant j^2$ and $|X_1-\mathbb E(X_1)|\geqslant kj$ on $A_j$, swapping the order of summation yields $$\sum_{n=1}^\infty c_n=\sum_{n=1}^\infty n\sum_{j\geqslant n}\mu(A_j)=\sum_{j\geqslant 1}\frac{j(j+1)}2\mu(A_j)\leqslant \sum_{j\geqslant 1} j^2\mu(A_j)\leqslant \frac 1{k^2}\sum_{j\geqslant 1}\int_{A_j}|X_1-\mathbb E(X_1)|^2\,\mathrm d\mu\leqslant\frac1{k^2}\operatorname{Var}(X_1).$$
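The chain of bounds above can be checked numerically for a concrete distribution: the partial sums of $c_N = N\,\mu\{|X_1-\mathbb E(X_1)|\geqslant kN\}$ should stay below $\operatorname{Var}(X_1)/k^2$. A minimal sketch, assuming standard normal $X_1$ (so the tail is available exactly via the complementary error function) and $k=1$:

```python
import math

def tail(t):
    # P(|Z| >= t) for Z ~ N(0, 1), via the complementary error function:
    # P(|Z| >= t) = erfc(t / sqrt(2))
    return math.erfc(t / math.sqrt(2.0))

k = 1.0
var = 1.0  # Var(Z) = 1 for a standard normal

# partial sum of c_N = N * P(|Z - E[Z]| >= k*N); the terms decay
# super-exponentially, so 50 terms is far more than enough
s = sum(N * tail(k * N) for N in range(1, 50))
print(s, var / k**2)
```

The computed sum (about $0.42$) indeed sits below the bound $\operatorname{Var}(X_1)/k^2 = 1$, as the argument predicts.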