Consider the sum of i.i.d. random variables $$ X_t = \sum_{i=1}^{t}x_i $$ where the $x_i$ have zero mean and finite variance $\sigma^2$. Let $M,m>0$. Since $\operatorname{Var}(X_t)=t\sigma^2$, Chebyshev's inequality gives $$ H_t(M)\equiv P(|X_t|>tm+M)\le\frac{t\sigma^2}{(tm+M)^2}. $$ That is, $H_t(M)$ decays at least as fast as $1/t$.
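As a quick sanity check of the bound above, here is a Monte Carlo sketch. It assumes the $x_i$ are Gaussian (the question only requires zero mean and finite variance), and the values $\sigma=1$, $m=0.5$, $M=2$ are illustrative choices, not from the question:

```python
import numpy as np

# Monte Carlo check of the Chebyshev tail bound. Assumed setup:
# x_i i.i.d. N(0, sigma^2), so X_t ~ N(0, t*sigma^2).
# sigma, m, M are illustrative values chosen for this sketch.
rng = np.random.default_rng(0)
sigma, m, M = 1.0, 0.5, 2.0
n_trials = 200_000

for t in (10, 100, 1000):
    # Sample X_t directly from its exact distribution N(0, t*sigma^2)
    X_t = rng.normal(0.0, sigma * np.sqrt(t), size=n_trials)
    H_t = np.mean(np.abs(X_t) > t * m + M)      # empirical P(|X_t| > tm + M)
    bound = t * sigma**2 / (t * m + M) ** 2     # Chebyshev: t*sigma^2/(tm+M)^2
    print(f"t={t:4d}  empirical={H_t:.6f}  Chebyshev bound={bound:.6f}")
    assert H_t <= bound
```

In the Gaussian case the empirical tail is far below the Chebyshev bound, which already hints that $H_t(M)$ can decay much faster than $1/t$ for light-tailed $x_i$.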
I am then interested in the probability of the following event: $$ G_t(M) \equiv P\big((t+1)m+M \ge |X_t| > tm+M\big), $$ which should be much smaller than $H_t(M)$ and decay faster. But how fast can it decay? In particular, can we make $$ \sum_{t=1}^{\infty} G_t(M) $$ convergent?
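For intuition, the same Monte Carlo setup can estimate $G_t(M)$ alongside $H_t(M)$; again the Gaussian distribution and the values $\sigma=1$, $m=0.5$, $M=2$ are assumptions made only for this sketch:

```python
import numpy as np

# Empirically compare G_t(M) = P(tm+M < |X_t| <= (t+1)m+M) with H_t(M)
# under an assumed Gaussian model: x_i i.i.d. N(0, sigma^2).
rng = np.random.default_rng(1)
sigma, m, M = 1.0, 0.5, 2.0
n_trials = 500_000

for t in (5, 10, 20):
    A = np.abs(rng.normal(0.0, sigma * np.sqrt(t), size=n_trials))  # |X_t|
    H_t = np.mean(A > t * m + M)                         # P(|X_t| > tm+M)
    G_t = np.mean((A > t * m + M) & (A <= (t + 1) * m + M))
    print(f"t={t:3d}  H_t={H_t:.6f}  G_t={G_t:.6f}")
    assert G_t <= H_t  # G_t is a sub-event of the H_t tail event
```

By construction $G_t(M) \le H_t(M)$ always holds; the simulation only illustrates the sizes involved and does not settle how fast $G_t(M)$ decays in general.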