I just did a brief review of various sources, and they all specify that if the $X_i$'s are independent, identically distributed random variables, then $S_n/n \rightarrow E(X_i)$ (with respect to various notions of convergence), where $S_n = X_1 + \cdots + X_n$. Do the various $X_i$'s really need to be independent?
Laws of large numbers and independence
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
It is possible to construct cases where the $X_i$ are dependent and the convergence still holds, so I suppose you are looking for a case of dependence where it does not hold.
What about this: set $X_0 \sim N(0,1)$. Then, for each $i \geq 1$, repeatedly sample $Y_i$ from $N(0,1)$ until $\operatorname{sgn}(Y_i)=\operatorname{sgn}(X_{i-1})$, and set $X_i=Y_i$. Each $X_i$ is still marginally $N(0,1)$, but every term inherits the sign of $X_0$, so $S_n/n \rightarrow \pm\sqrt{2/\pi} \neq 0 = E[X_i]$.
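A quick simulation of this construction (a sketch; the function name `sample_path` and the seed are my own choices) shows the sample mean settling near $\pm\sqrt{2/\pi} \approx 0.798$ rather than near $0$:

```python
import math
import random

# X_0 ~ N(0,1); each subsequent X_i is drawn from N(0,1) repeatedly
# until its sign matches sgn(X_{i-1}).  Every X_i is marginally N(0,1),
# but all terms inherit the sign of X_0, so S_n/n -> +-E|Z| = +-sqrt(2/pi).
def sample_path(n, seed=0):
    rng = random.Random(seed)
    x = rng.gauss(0.0, 1.0)  # X_0
    total = x
    for _ in range(1, n):
        y = rng.gauss(0.0, 1.0)
        while (y > 0) != (x > 0):  # resample until sgn(Y_i) = sgn(X_{i-1})
            y = rng.gauss(0.0, 1.0)
        x = y
        total += x
    return total / n

print(sample_path(100_000))  # close to +sqrt(2/pi) or -sqrt(2/pi), never 0
```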
Edit 1:
Even simpler: set $X_0=-1$ with probability $1/2$ and $X_0=+1$ with probability $1/2$, and set $X_i=X_{i-1}$ for all $i \geq 1$. Then $S_n/n = X_0 \rightarrow \pm 1$, while $E[X_i]=0$.
Note that the theorem is for $S_n/n$ and not $E[S_n]/n$.
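The point is easy to see numerically as well (a minimal sketch; `sample_mean` is a name of my choosing): the running average never moves toward $E[X_i]=0$, because every term equals $X_0$.

```python
import random

# Edit 1's construction: X_0 = +-1 with probability 1/2 each,
# and X_i = X_{i-1} for all i.  Each X_i has mean 0, yet
# S_n = n * X_0, so S_n/n equals X_0 exactly for every n.
def sample_mean(n, seed=None):
    x0 = random.Random(seed).choice([-1, 1])
    return n * x0 / n  # every term equals X_0

print(sample_mean(1000, seed=1))  # always -1 or +1, never near 0
```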
Edit 2, in response to Did's comment:
Let $X_0=-1$ with probability $1/2$ and $X_0=+1$ with probability $1/2$, and set $X_i=-X_{i-1}$. Now $S_n/n \rightarrow 0 = E[X_i]$, even though the $X_i$ are perfectly dependent.
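Checking this alternating construction numerically (again a sketch with a name of my choosing): the partial sums stay bounded by $1$ in absolute value, so $S_n/n$ is at most $1/n$.

```python
import random

# Edit 2's construction: X_0 = +-1 with probability 1/2 each,
# and X_i = -X_{i-1}.  The terms alternate sign, so |S_n| <= 1
# and S_n/n -> 0 = E[X_i] despite total dependence.
def sample_mean(n, seed=None):
    x = random.Random(seed).choice([-1, 1])
    total = 0
    for _ in range(n):
        total += x
        x = -x  # X_i = -X_{i-1}
    return total / n

print(abs(sample_mean(10_001)))  # at most 1/n, tending to 0
```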
In conclusion: with dependent RVs, there are cases where the conclusion of the LLN holds and cases where it does not.