I state the famous strong law of large numbers in its simplest form:
If $\{ X_n\}_{n \in \mathbb{N}}$ is an i.i.d. sequence of integrable random variables, then
$$ \lim_{n \rightarrow \infty} \frac{1}{n} \sum_{i= 1}^n X_i = E[X_0] \qquad a.s.$$
I am wondering whether the sequence $Y_n = \frac{1}{n} \sum_{i=1}^n X_i$ almost surely oscillates around the mean as it converges: for every $n_1 \in \mathbb{N}$ with $Y_{n_1} > E[X_0]$ there exists $n_2 > n_1$ with $Y_{n_2} < E[X_0]$, and vice versa.
In particular how would one go about proving such a statement?
We consider the case $E(X_1^2)<\infty$ and $\operatorname{Var}(X_1)>0$ (if the variance is zero, $Y_n$ is constant and there is nothing to prove). We reformulate the problem by defining $U_i=X_i-E(X_i)$; we then wish to show that the partial sums $\sum_{i=1}^n U_i$ oscillate around zero.
Writing $S_n=\sum_{i=1}^n U_i$ for the centered partial sums, we know that $\limsup_{n}\frac{S_n}{\sqrt{n}}=\infty$ almost surely. One way to see this is by combining the central limit theorem with Kolmogorov's zero-one law.
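To spell out that step, here is a sketch, with $S_n=\sum_{i=1}^n U_i$ and assuming $\sigma^2=\operatorname{Var}(X_1)\in(0,\infty)$. For each fixed $m$,
$$\frac{S_n}{\sqrt{n}}-\frac{S_n-S_m}{\sqrt{n}}=\frac{S_m}{\sqrt{n}}\longrightarrow 0 \quad (n\to\infty),$$
so $\limsup_{n} S_n/\sqrt{n}$ is unchanged if we modify $X_1,\dots,X_m$; it is therefore measurable with respect to the tail $\sigma$-field and, by Kolmogorov's zero-one law, almost surely equal to a constant $c\in[-\infty,\infty]$. By the central limit theorem, $P(S_n/\sqrt{n}>K)\to 1-\Phi(K/\sigma)>0$ for every $K>0$, hence $P(\limsup_{n} S_n/\sqrt{n}\ge K)>0$, which forces $c\ge K$; letting $K\to\infty$ gives $c=+\infty$.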
By the same argument applied to $-U_i$, we also obtain that $\liminf_{n}\frac{1}{\sqrt{n}}\sum_{i=1}^n U_i=-\infty$ almost surely.
In particular, $\limsup_{n} \sum_{i=1}^n U_i=\infty$ and $\liminf_{n} \sum_{i=1}^n U_i=-\infty$ almost surely, so the partial sums change sign infinitely often. Since $Y_n-E[X_1]=\frac{1}{n}\sum_{i=1}^n U_i$ has the same sign as the partial sum, the running mean $Y_n$ crosses $E[X_1]$ infinitely often, which is exactly the claimed oscillation. $\square$
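As a quick numerical illustration (not part of the proof), here is a short simulation, assuming NumPy is available, using i.i.d. $\mathrm{Exp}(1)$ samples (mean $1$): it tracks the running mean $Y_n$ and counts the sign changes of the centered partial sums, which correspond to $Y_n$ crossing the mean.

```python
# Illustration only: i.i.d. Exp(1) samples have mean 1, so the running
# mean converges to 1 while the centered partial sums change sign often.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.exponential(scale=1.0, size=n)      # i.i.d. X_i with E[X_i] = 1
s = np.cumsum(x - 1.0)                      # centered partial sums S_n
y = np.cumsum(x) / np.arange(1, n + 1)      # running means Y_n

# A sign change of S_n is exactly a crossing of the mean by Y_n.
crossings = np.count_nonzero(np.sign(s[:-1]) != np.sign(s[1:]))
print(f"final running mean: {y[-1]:.4f}")
print(f"sign changes of S_n among the first {n} terms: {crossings}")
```

The exact number of crossings depends on the seed, but by the argument above it grows without bound almost surely as $n\to\infty$.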