Strong Law of Large Numbers with different n


In this problem, we want to show that $\frac{X_1+\dots+X_n}{\sqrt{n}(\log n)^{\frac{1}{2}+\epsilon}} \to 0$ almost surely as $n \to \infty$.

We know that $X_1, X_2, \dots$ are i.i.d. random variables with mean $0$ and variance $1$.

I followed the standard proof of the strong law of large numbers using Kolmogorov's maximal inequality.
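For reference, Kolmogorov's maximal inequality states that for independent, mean-zero random variables $Y_1, \dots, Y_n$ with finite variances, and any $\lambda > 0$,

$$P\left(\max_{1 \le k \le n}\left|\sum_{i=1}^{k} Y_i\right| \ge \lambda\right) \le \frac{1}{\lambda^2}\sum_{i=1}^{n}\operatorname{Var}(Y_i).$$

Here it is applied to the summands $Y_i = \frac{X_i}{\sqrt{i}(\log i)^{\frac{1}{2}+\epsilon}}$.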

I let $A_n = \left(\max_{m \le k \le n}\left|\sum_{i=m}^{k}\frac{X_i}{\sqrt{i}(\log i)^{\frac{1}{2}+\epsilon}}\right| \ge \epsilon\right)$

Then, by the maximal inequality, $P(A_n)\le\frac{1}{\epsilon^2}\sum_{i=m}^{n}\frac{\operatorname{Var}(X_i)}{i(\log i)^{1+2\epsilon}}=\frac{1}{\epsilon^2}\sum_{i=m}^{n}\frac{1}{i(\log i)^{1+2\epsilon}}$, since $\left(\sqrt{i}(\log i)^{\frac{1}{2}+\epsilon}\right)^2 = i(\log i)^{1+2\epsilon}$.

Where I am stuck is in using the integral test to bound the final sum on the RHS above by a sequence, depending only on $m$, that converges to $0$ as $m \to \infty$. That would show the RHS tends to $0$, and hence so does the LHS.

Best Answer

We will use the following standard result:

Theorem. The Bertrand series $\sum_{k=2}^\infty \frac{1}{k^\alpha(\ln k)^\beta}$ converges if, and only if, $\alpha > 1$, or ($\alpha=1$ and $\beta > 1$).
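In the case needed here ($\alpha = 1$, $\beta > 1$), the theorem follows from exactly the integral test you had in mind: $x \mapsto \frac{1}{x(\ln x)^{\beta}}$ is positive and decreasing for $x \ge 2$, so for $m \ge 3$,

$$\sum_{k=m}^{\infty} \frac{1}{k(\ln k)^{\beta}} \le \int_{m-1}^{\infty} \frac{dx}{x(\ln x)^{\beta}} = \left[\frac{(\ln x)^{1-\beta}}{1-\beta}\right]_{m-1}^{\infty} = \frac{1}{(\beta-1)\,(\ln(m-1))^{\beta-1}} \xrightarrow[m\to\infty]{} 0,$$

which also gives an explicit rate of decay for the tail.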

Here, your RHS is of the form $\frac{1}{\epsilon^2}\sum_{k=m}^n \frac{1}{k (\ln k)^{1+2\epsilon}}$. By the above theorem, the series $\sum_{k=2}^\infty \frac{1}{k (\ln k)^{1+2\epsilon}}$ converges, since $1+2\epsilon > 1$. So your RHS is bounded above: $$ \frac{1}{\epsilon^2}\sum_{k=m}^n \frac{1}{k (\ln k)^{1+2\epsilon}} \leq \frac{1}{\epsilon^2}\sum_{k=m}^\infty \frac{1}{k (\ln k)^{1+2\epsilon}} \xrightarrow[m\to\infty]{} 0, $$ where the limit follows from the fact that the tails of a convergent series tend to $0$.
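As a quick numerical sanity check (a sketch only; the function names, the choice $\epsilon = 0.1$, and the truncation cutoff `10**6` are mine), one can compute truncated tails of the series and compare them with the integral-test bound $\frac{1}{2\epsilon(\ln(m-1))^{2\epsilon}}$:

```python
import math

EPS = 0.1       # the epsilon in the exponent (1/2 + eps)
CUTOFF = 10**6  # truncation point for the (convergent) infinite tail

def tail_sum(m, eps=EPS, upper=CUTOFF):
    """Truncated tail: sum_{k=m}^{upper} 1 / (k * (ln k)^(1 + 2*eps))."""
    return sum(1.0 / (k * math.log(k) ** (1.0 + 2.0 * eps))
               for k in range(m, upper + 1))

def integral_bound(m, eps=EPS):
    """Integral-test bound on the full tail: 1 / (2*eps * (ln(m-1))^(2*eps))."""
    return 1.0 / (2.0 * eps * math.log(m - 1) ** (2.0 * eps))

for m in (10, 100, 1000, 10000):
    print(f"m={m:>5}  tail~{tail_sum(m):.4f}  bound={integral_bound(m):.4f}")
```

The tails shrink as $m$ grows (slowly, because of the logarithm), and each truncated tail stays below the corresponding integral bound.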