Intuition for the law of the iterated logarithm


Let $X_i$ be i.i.d. random variables with $EX_i = 0$ and $\operatorname{Var} X_i = 1$, and let $S_n = X_1+\cdots+X_n$. Then the law of the iterated logarithm says that almost surely

$$\limsup_{n\to\infty}\frac{S_n}{\sqrt{n\log{\log{n}}}} = \sqrt{2}$$

On the other hand, the central limit theorem says that, in distribution,

$$\frac{S_n}{\sqrt{n}} \to N(0,1)$$

Can anyone explain why dividing by an extra factor of $\sqrt{\log{\log{n}}}$ should turn an $N(0,1)$ limit into a quantity whose $\limsup$ is the constant $\sqrt{2}$?
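To see the two scalings side by side, here is a small simulation sketch (assuming standard normal steps; the horizon $N=10^5$, the seed, and the cutoff $n\ge 100$ are arbitrary choices):

```python
import math
import random

random.seed(1)  # fixed seed; any seed shows the same qualitative picture

N = 10**5
s = 0.0
running_max = 0.0
for n in range(1, N + 1):
    s += random.gauss(0.0, 1.0)  # one N(0,1) step of the walk S_n
    if n >= 100:  # skip small n, where log log n is tiny and the ratio is noisy
        running_max = max(running_max, s / math.sqrt(n * math.log(math.log(n))))

# CLT scaling: a single sample of S_N / sqrt(N), distributed roughly as N(0,1)
print("S_N / sqrt(N) =", s / math.sqrt(N))
# LIL scaling: the running sup creeps (very slowly) toward sqrt(2)
print("max_n S_n / sqrt(n log log n) =", running_max)
```

The convergence in the LIL is extremely slow, so at any feasible $N$ the running maximum typically sits noticeably below $\sqrt{2}\approx 1.414$; the point is only that the two scalings behave differently in kind.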

To try to understand this I considered the simple case where each $X_n$ is $N(0,1)$, so that $S_n/\sqrt{n}$ is exactly $N(0,1)$ and hence $S_n/\sqrt{n\log{\log{n}}}$ is distributed as $N(0,1/\log{\log{n}})$. It would then seem that even to have just $\limsup_{n\to\infty}\frac{S_n}{\sqrt{n\log{\log{n}}}} \le \sqrt{2}$ requires either

$$\sum_{n=3}^\infty P\left(\frac{S_n}{\sqrt{n\log{\log{n}}}} > \sqrt{2}\right) < \infty$$

or, if

$$\sum_{n=3}^\infty P\left(\frac{S_n}{\sqrt{n\log{\log{n}}}} > \sqrt{2}\right) = \infty$$

then to achieve $\limsup_{n\to\infty}\frac{S_n}{\sqrt{n\log{\log{n}}}} \le \sqrt{2}$ the events $\left\{ \omega : \frac{S_n}{\sqrt{n\log{\log{n}}}} > \sqrt{2}\right\}$ cannot, for example, keep covering the probability space infinitely often. I don't know the value of $\sum_{n=3}^\infty P\left(\frac{S_n}{\sqrt{n\log{\log{n}}}} > \sqrt{2}\right)$, but since it is a sum of tail probabilities of normal distributions I would not expect a closed form even for its partial sums.
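In the Gaussian toy model the summands are exact standard-normal tail probabilities $P(Z > \sqrt{2\log\log n})$, and the usual tail estimate $P(Z>x)\approx e^{-x^2/2}/(x\sqrt{2\pi})$ suggests each term is roughly of size $1/\log n$ up to slowly varying factors, so the partial sums should grow without bound. A short numerical sketch, using $P(Z > x) = \tfrac12\operatorname{erfc}(x/\sqrt{2})$ (the checkpoints are arbitrary):

```python
import math

def tail(n):
    # P(S_n / sqrt(n log log n) > sqrt(2)) in the Gaussian model,
    # where S_n / sqrt(n) is exactly N(0,1); needs n >= 3 so that log log n > 0
    x = math.sqrt(2.0 * math.log(math.log(n)))
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def partial_sum(N):
    # partial sum of the series from n = 3 to N
    return sum(tail(n) for n in range(3, N + 1))

for N in (10**2, 10**3, 10**4, 10**5):
    print(N, partial_sum(N))
```

The partial sums keep growing, so the first Borel–Cantelli lemma cannot be applied directly to these events; textbook proofs of the LIL instead apply Borel–Cantelli along geometric subsequences $n_k \approx \theta^k$, combined with a maximal inequality.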

In the other direction, for $\limsup_{n\to\infty}\frac{S_n}{\sqrt{n\log{\log{n}}}}$ not to take a value lower than $\sqrt{2}$, isn't something like the following necessary for every $\epsilon > 0$?

$$\sum_{n=3}^\infty P\left(\sqrt{2}-\epsilon < \frac{S_n}{\sqrt{n\log{\log{n}}}} \le \sqrt{2}\right) = \infty$$

Can anyone explain why this number $\sqrt{2}$ should pop up?


The main difference is that one convergence is almost sure while the other is in distribution. To see how both can occur simultaneously, consider an independent sequence of random variables $(\xi_n)$ such that $P(\xi_n=\sqrt{2\log\log n})=1/n$ and $P(\xi_n=0)=1-1/n$.

Then $\xi_n\to0$ in distribution, since $P(\xi_n\neq0)=1/n\to0$. On the other hand, $\sum_n P\big(\xi_n=\sqrt{2\log\log n}\big)=\sum_n 1/n=\infty$ and the $\xi_n$ are independent, so by the second Borel–Cantelli lemma $\xi_n=\sqrt{2\log\log n}$ infinitely often almost surely. Since the ratio $\xi_n/\sqrt{\log\log n}$ only ever takes the values $0$ and $\sqrt2$, $$ \limsup_{n\to\infty}\frac{\xi_n}{\sqrt{\log\log n}}=\sqrt2\quad \text{almost surely}. $$
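A quick simulation sketch of this example (the horizon $N$ and the seed are arbitrary choices):

```python
import math
import random

random.seed(0)  # fixed seed for reproducibility

N = 10**5
hits = []  # indices n at which xi_n takes its nonzero value sqrt(2 log log n)
for n in range(3, N + 1):
    if random.random() < 1.0 / n:  # P(xi_n = sqrt(2 log log n)) = 1/n
        xi = math.sqrt(2.0 * math.log(math.log(n)))
        hits.append((n, xi / math.sqrt(math.log(math.log(n)))))

# P(xi_n != 0) = 1/n -> 0, so xi_n -> 0 in distribution; yet sum 1/n = infinity,
# so the second Borel-Cantelli lemma makes nonzero values recur forever a.s.
print("number of nonzero xi_n up to N:", len(hits))
print("every nonzero ratio equals sqrt(2):",
      all(abs(r - math.sqrt(2.0)) < 1e-9 for _, r in hits))
```

The number of nonzero $\xi_n$ up to $N$ grows like $\sum_{n\le N} 1/n \approx \log N$, so nonzero values keep recurring even though their probability at any fixed $n$ tends to $0$.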