Almost sure convergence of average of random variables


In my statistical inference course exercise guide, I am confronted with the following problem:

Let $0<\theta<1/2$, and define a sequence $\{X_n\}_{n\in\mathbb{N}}$ of independent discrete random variables by $P(X_n=n^{\theta})=P(X_n=-n^{\theta})=1/2$. Show that $$\frac{1}{n}\sum_{k=1}^{n}X_k \to 0$$ almost surely.
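As a quick numerical sanity check of the claim (an illustration, not a proof; the values $\theta=0.25$, $n=10^6$, and the random seed are arbitrary choices of mine), one can simulate the running averages:

```python
import numpy as np

# Illustrative choices, not fixed by the problem: theta in (0, 1/2), sample size n
theta = 0.25
n = 10**6

rng = np.random.default_rng(0)
k = np.arange(1, n + 1)
signs = rng.choice([-1.0, 1.0], size=n)   # X_k = +k^theta or -k^theta, each w.p. 1/2
X = signs * k**theta

running_avg = np.cumsum(X) / k            # (1/n) * sum_{k=1}^{n} X_k
print(abs(running_avg[-1]))               # small for large n, consistent with a.s. convergence to 0
```

The standard deviation of $\overline{X}_n$ here is of order $n^{\theta-1/2}$, so for $\theta<1/2$ the final average should be close to $0$.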

My attempt: I try to use the Borel–Cantelli lemma. If $\overline{X}_n$ denotes the average of the first $n$ variables, it suffices to show that for every $\epsilon >0$, $$\sum_{n=1}^{\infty}P(|{\overline{X}_n}| \geq \epsilon) < \infty. \tag{1}$$ A quick computation shows that $E[X_n] = 0$ and $\operatorname{Var}(X_n) = n^{2\theta}$ for each $n$, so $E[\overline{X}_n]=0$ and, by independence, $$\operatorname{Var}(\overline{X}_n) = \frac{1}{n^2}\sum_{k=1}^{n}\operatorname{Var}(X_k) = \frac{1}{n^2}\sum_{k=1}^{n}k^{2\theta}. \tag{2}$$ Applying Chebyshev's inequality and plugging $(2)$ into $(1)$ yields $$\sum_{n=1}^{\infty}P(|{\overline{X}_n}| \geq \epsilon) \leq \sum_{n=1}^{\infty}\frac{\operatorname{Var}(\overline{X}_n)}{\epsilon^2} = \frac{1}{\epsilon^2}\sum_{n=1}^{\infty}\frac{1}{n^2}\sum_{k=1}^{n}k^{2\theta}.$$ The last series actually diverges: $\sum_{k=1}^{n}k^{2\theta} \sim \frac{n^{2\theta+1}}{2\theta+1}$, so the summands behave like $\frac{n^{2\theta-1}}{2\theta+1}$, and $\sum_n n^{2\theta-1}=\infty$ because $2\theta-1>-1$. So the plain second-moment Chebyshev bound is too weak here. Any suggestions?
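The divergence of that last bound is easy to confirm numerically (again with the arbitrary choice $\theta=0.25$): the summands $\frac{1}{n^2}\sum_{k=1}^{n}k^{2\theta}$ decay only like $n^{2\theta-1}$, so their partial sums keep growing.

```python
import numpy as np

theta = 0.25   # arbitrary value in (0, 1/2) for illustration
N = 10**5
n = np.arange(1, N + 1)

# a_n = (1/n^2) * sum_{k=1}^{n} k^(2*theta), the n-th summand of the Chebyshev bound
a = np.cumsum(n**(2 * theta)) / n**2
partial_sums = np.cumsum(a)

# a_n ~ n^(2*theta - 1) / (2*theta + 1), which is not summable for theta > 0
print(a[-1] * (2 * theta + 1) * N**(1 - 2 * theta))  # approximately 1
print(partial_sums[-1] / partial_sums[N // 10 - 1])  # roughly 10^(2*theta): partial sums grow like N^(2*theta)
```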