Decay rate for the probability of deviating from the mean for the sample mean

Let $X_1,X_2, \ldots$ be independent and identically distributed random variables with finite mean $\mu$. No assumption is made on higher-order moments: they may be finite or infinite. Then, the law of large numbers asserts that $$ \textbf{(1)} \qquad P\left(\left|\frac{1}{n}\sum_{i=1}^n X_i - \mu\right|>\epsilon\right) \to 0, \quad n\to \infty, \quad \forall \epsilon>0. $$ In fact, convergence of $\frac{1}{n}\sum_{i=1}^n X_i$ to $\mu$ holds in a stronger form, namely with probability one, and this can be shown by resorting to the first Borel-Cantelli lemma and a coupling argument for sums of truncated random variables, as done e.g. in

Billingsley (1995) Probability and Measure, Third edition, Wiley, pp. 282--283.
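For what it's worth, the phenomenon is easy to see numerically. The sketch below (my own illustrative choice, not from the reference) uses NumPy with Pareto draws of shape $a=1.5$, which have finite mean $1/(a-1)=2$ but infinite variance, so Chebyshev's inequality is unavailable; the running sample mean still settles near the mean:

```python
import numpy as np

# Numerical sketch (not a proof): the strong LLN needs only a finite
# first moment. Pareto draws with shape a = 1.5 have mean 1/(a-1) = 2
# but infinite variance, yet the running sample mean still converges.
rng = np.random.default_rng(0)
a = 1.5
mu = 1.0 / (a - 1.0)  # theoretical mean = 2

n = 10**6
x = rng.pareto(a, size=n)
running_mean = np.cumsum(x) / np.arange(1, n + 1)

# Snapshots of the running mean at increasing sample sizes.
print(running_mean[99], running_mean[9999], running_mean[-1])
```

With infinite variance the fluctuations die out more slowly than the usual $n^{-1/2}$ scale, so the convergence is visibly sluggish, which is consistent with the difficulty of extracting an explicit rate.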

Yet, such arguments seem not to provide an explicit decay rate to $0$ for the probability on the left-hand side of $\textbf{(1)}$.

QUESTION Under the above general assumptions, is it possible to conclude that $$ \textbf{(2)} \qquad P\left(\left|\frac{1}{n}\sum_{i=1}^n X_i - \mu\right|>\epsilon\right) = O(r_n), \quad n\to \infty, \quad \forall \epsilon>0, $$ for some $r_n =O(1/\log n)$? Clearly, assuming that the $X_i$'s have a finite second moment, Chebyshev's inequality immediately gives $r_n=1/n$, but what if no further assumption is made on higher-order moments?
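As a non-rigorous illustration of what can happen between the two regimes, one can Monte-Carlo-estimate the deviation probability for a heavy-tailed example. For regularly varying tails with index $a \in (1,2)$, Nagaev-type bounds suggest a decay of order $n^{1-a}$, slower than Chebyshev's $1/n$ but still much faster than $1/\log n$. The distribution, sample sizes, tolerance $\epsilon$, and replication count below are all illustrative choices of mine:

```python
import numpy as np

# Monte Carlo sketch (illustration, not a proof): estimate
# P(|S_n/n - mu| > eps) for Pareto(a = 1.5) summands, which have finite
# mean mu = 2 but no second moment, so Chebyshev's 1/n rate does not apply.
rng = np.random.default_rng(1)
a, mu, eps = 1.5, 2.0, 0.5
m = 2000  # independent replications per sample size

probs = {}
for n in (10, 100, 1000):
    # m independent sample means, each built from n i.i.d. Pareto draws.
    sample_means = rng.pareto(a, size=(m, n)).mean(axis=1)
    probs[n] = float(np.mean(np.abs(sample_means - mu) > eps))

print(probs)  # estimated deviation probabilities, decreasing in n
```

The estimated probabilities decrease with $n$, but of course a simulation for one distribution says nothing about the worst case over all laws with a finite first moment, which is what the question asks about.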