Does the sequence of r.v. given by the law of large numbers converge almost surely to the mean in an oscillatory fashion?


I state the very famous strong law of large numbers in its simplest form:

Given an IID sequence of integrable random variables $\{ X_n\}_{n \in \mathbb{N}}$, we have

$$ \lim_{n \rightarrow \infty} \frac{1}{n} \sum_{i= 1}^n X_i = E[X_1] \qquad a.s.$$

I am wondering whether the sequence given by $Y_n = \frac{1}{n} \sum_{i= 1}^n X_i $ converges almost surely to the mean in an oscillating fashion: for every $n_1 \in \mathbb{N}$ with $Y_{n_1} > E[X_1]$ there exists an $n_2> n_1$ with $Y_{n_2} < E[X_1]$, and vice versa.

In particular, how would one go about proving such a statement?



Accepted answer:

We consider the case $E(X_1^2)<\infty$ with $X_1$ not almost surely constant. We reformulate the problem by defining $U_i=X_i-E(X_i)$; we then wish to show that the centered partial sums $\sum_{i=1}^nU_i$ oscillate around zero.

We know that $\limsup_{n}\frac{1}{\sqrt{n}}\sum_{i=1}^nU_i=\infty$, almost surely. One way to see this is by combining the central limit theorem with Kolmogorov's zero-one law.

Applying the same argument to $-U_i$, we also obtain that $\liminf_{n}\frac{1}{\sqrt{n}}\sum_{i=1}^nU_i=-\infty$ almost surely.

In particular, $\limsup_{n} \sum_{i=1}^nU_i=\infty$ and $\liminf_{n} \sum_{i=1}^nU_i=-\infty$ almost surely, so the partial sums are positive infinitely often and negative infinitely often, and hence so is $\frac{1}{n}\sum_{i=1}^nU_i$. $\square$
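A quick numerical sanity check of this answer (an illustration, not part of the proof): simulate fair coin flips $X_i \in \{0,1\}$, so $E[X_i] = 1/2$, and track the centered partial sums $S_n = \sum_{i\le n}(X_i - 1/2)$. If the argument above is right, $S_n$ should keep changing sign as $n$ grows.

```python
import random

# Sketch: centered partial sums of fair coin flips should oscillate in sign,
# consistent with limsup S_n = +inf and liminf S_n = -inf almost surely.
random.seed(0)
n = 1_000_000
s = 0.0
sign_changes = 0
last_sign = 0
s_min = s_max = 0.0
for _ in range(n):
    s += random.randint(0, 1) - 0.5     # add U_i = X_i - E[X_i]
    s_min, s_max = min(s_min, s), max(s_max, s)
    sign = (s > 0) - (s < 0)            # +1, -1, or 0
    if sign != 0:
        if last_sign != 0 and sign != last_sign:
            sign_changes += 1           # S_n flipped from one side of 0 to the other
        last_sign = sign

print("sign changes of S_n:", sign_changes)
print("min S_n:", s_min, "max S_n:", s_max)
```

For a run of this length one typically sees on the order of hundreds of sign changes, i.e. the sample mean crosses the true mean again and again.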

Another answer:

There is the following characterization for random walks:

Let $S_n = \sum_{i=1}^n X_i$ be a random walk with iid, non-degenerate increments $X_i \in L^1$ (i.e. $\mathbb{P}(X_1 = 0) < 1$). Then $\mathbb{E}(X_1)=0$ if, and only if, $$-\infty = \liminf_{n \to \infty} S_n < \limsup_{n \to \infty} S_n = \infty \quad \text{a.s.}$$

In particular, we find that any such random walk $S_n = \sum_{i=1}^n X_i$ satisfying $\mathbb{E}(X_1)=0$ will oscillate between $- \infty$ and $\infty$. This implies that

$$\frac{1}{n} \sum_{i=1}^n X_i < 0 = \mathbb{E}(X_1) \quad \text{for infinitely many $n$}$$

and

$$\frac{1}{n} \sum_{i=1}^n X_i > 0 = \mathbb{E}(X_1) \quad \text{for infinitely many $n$}.$$

This proves the "oscillating convergence" to the mean $\mathbb{E}(X_1)$ for the case $\mathbb{E}(X_1)=0$.

For the general case, i.e. if $X_1$ does not have expectation zero, we can apply the above reasoning to the shifted increments

$$Y_i := X_i- \mathbb{E}(X_i)$$

to obtain that

$$\frac{1}{n} \sum_{i=1}^n (X_i-\mathbb{E}(X_1)) < 0 \quad \text{for infinitely many $n$}$$

and

$$\frac{1}{n} \sum_{i=1}^n (X_i-\mathbb{E}(X_1)) > 0 \quad \text{for infinitely many $n$}.$$

This is clearly equivalent to saying that

$$\frac{1}{n} \sum_{i=1}^n X_i < \mathbb{E}(X_1) \quad \text{for infinitely many $n$}$$

and

$$\frac{1}{n} \sum_{i=1}^n X_i > \mathbb{E}(X_1) \quad \text{for infinitely many $n$}.$$
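The shifted case can also be checked empirically (a sketch for illustration only; the Exponential(1) distribution here is my choice of a non-centered example, not something from the answers): the running mean of iid Exponential(1) samples has true mean $\mathbb{E}(X_1) = 1$, and it should cross that level again and again while converging to it.

```python
import random

# Sketch: running mean of iid Exponential(1) samples, counting how often it
# crosses the true mean E[X_1] = 1 (oscillating convergence).
random.seed(1)
n = 1_000_000
total = 0.0
crossings = 0
was_above = None
for i in range(1, n + 1):
    total += random.expovariate(1.0)    # one Exponential(1) sample
    above = total / i > 1.0             # is the sample mean above E[X_1]?
    if was_above is not None and above != was_above:
        crossings += 1                  # sample mean crossed the true mean
    was_above = above

print("crossings of E[X_1]:", crossings)
print("final sample mean:", total / n)
```

The final sample mean lands close to 1, while the crossing count shows the "above / below" pattern from the displays above occurring many times along the way.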