Laws of large numbers and independence


I just did a brief review of various sources, and they all state that if the $X_i$ are independent, identically distributed random variables, then $S_n/n \rightarrow E(X_i)$ (with respect to various notions of convergence). Do the $X_i$ really need to be independent?

There are 3 solutions below.

BEST ANSWER

It is possible to construct cases where the $X_i$ are dependent and the convergence still holds, so I suppose you are looking for a case of dependence where it does not hold.

What about setting $X_0 \sim N(0,1)$? Now, for each $i \ge 1$, repeatedly sample a variable $Y_i$ from $N(0,1)$ until $\operatorname{sgn}(Y_i)=\operatorname{sgn}(X_{i-1})$, and set $X_i=Y_i$. Each $X_i$ is still marginally $N(0,1)$, but every $X_i$ shares the sign of $X_0$, so $S_n/n$ converges to $\pm E|Z| = \pm\sqrt{2/\pi} \neq 0 = E[X_i]$.
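A quick simulation of this construction (a sketch; the function name and sample size are my own choices) shows the sample mean settling near $\pm\sqrt{2/\pi} \approx \pm 0.798$ rather than near $0$:

```python
import random, math

def sample_chain(n, seed=0):
    """Simulate X_0 ~ N(0,1), then for each i >= 1 draw a fresh N(0,1)
    sample, rejecting until its sign matches sgn(X_{i-1})."""
    rng = random.Random(seed)
    x = rng.gauss(0, 1)
    total = x
    for _ in range(1, n):
        y = rng.gauss(0, 1)
        while (y > 0) != (x > 0):  # resample until the signs agree
            y = rng.gauss(0, 1)
        x = y
        total += x
    return total / n

# Every X_i shares the sign of X_0, so S_n/n approaches
# +-E|Z| = +-sqrt(2/pi) ~ +-0.798 rather than E[X_i] = 0.
print(sample_chain(100000))
```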

Edit 1:

Even simpler: set $X_0=-1$ with probability $1/2$ and $X_0=+1$ with probability $1/2$, and set $X_i=X_{i-1}$ for all $i \ge 1$. Then $S_n/n = X_0 = \pm 1$ for every $n$, while $E[X_i]=0$.

Note that the theorem is for $S_n/n$ and not $E[S_n]/n$.
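The distinction in that note can be seen in a short simulation (a sketch with made-up names; $1000$ independent repetitions of the experiment):

```python
import random

def limit_of_mean(seed):
    """X_0 = +-1 with probability 1/2, then X_i = X_{i-1} forever.
    Since every X_i equals X_0, S_n/n = X_0 for every n."""
    rng = random.Random(seed)
    return rng.choice([-1, 1])

limits = [limit_of_mean(s) for s in range(1000)]
# Each realization of S_n/n is exactly +-1, never near E[X_i] = 0 ...
print(all(abs(l) == 1 for l in limits))
# ... yet averaging over realizations, E[S_n]/n is still ~0.
print(sum(limits) / len(limits))
```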

Edit 2, in response to Did's comment:

Let $X_0=-1$ with probability $1/2$ and $X_0=+1$ with probability $1/2$, and set $X_i=-X_{i-1}$. Now $S_n/n \rightarrow 0=E[X_i]$.
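This alternating case can be checked directly (a sketch; the helper name is mine). The partial sums telescope, so $S_n \in \{0, \pm 1\}$ and $|S_n/n| \le 1/n \to 0$:

```python
import random

def alternating_mean(n, seed):
    """X_0 = +-1 with probability 1/2, then X_i = -X_{i-1}:
    perfectly dependent, yet S_n/n -> 0 = E[X_i]."""
    rng = random.Random(seed)
    x = rng.choice([-1, 1])
    total = x
    for _ in range(1, n):
        x = -x
        total += x
    return total / n

# Consecutive terms cancel in pairs, so S_n is 0 or +-1
# and S_n/n is within 1/n of E[X_i] = 0.
print(alternating_mean(10001, seed=42))
```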

In conclusion, with dependent random variables the LLN may hold in some cases and fail in others.

ANSWER

No. For instance, if you have a stationary time series with autocovariances $\gamma_j$ satisfying $0<\sum_j \gamma_j^2<\infty$, then the sample mean still converges (in $L^2$, and hence in probability), since

$$V\Bigl( \frac{1}{n}\sum_{t=1}^n X_t \Bigr)= \frac{1}{n^2} \sum_{t=1}^n \sum_{s=1}^n \gamma_{t-s} \to 0,$$

as $n\to\infty.$

ANSWER

The Birkhoff Ergodic Theorem can be viewed as a generalization of the Strong Law of Large Numbers to identically distributed but dependent (stationary) sequences.
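For reference, a sketch of the standard statement: for a measure-preserving transformation $T$ on a probability space $(\Omega,\mathcal F,P)$ and $f\in L^1(P)$,

$$\frac{1}{n}\sum_{k=0}^{n-1} f(T^k\omega) \;\longrightarrow\; E[f\mid\mathcal I](\omega) \quad \text{for $P$-almost every } \omega,$$

where $\mathcal I$ is the $\sigma$-field of $T$-invariant sets. When $T$ is ergodic the limit is the constant $E[f]$, which recovers the strong LLN for i.i.d. sequences via the shift map on sequence space.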