Why does the strong law of large numbers require random variables with the same variance?


The strong law of large numbers states that for an infinite sequence of i.i.d. random variables $X_1, X_2, \dots$ with $\operatorname{E}[X_1] = \mu$, and $\bar{X}_n = \frac{1}{n}(X_1 + \cdots + X_n)$,

$$\bar{X}_n \rightarrow \mu \text{ almost surely as } n \rightarrow \infty.$$

I can understand the requirement that $\operatorname{E}[X_1] = \operatorname{E}[X_2] = \cdots = \mu$, and that they all follow the same type of distribution (e.g. normal, exponential); but why is it important that they all have the same variance? If $n \rightarrow \infty$, why should it matter how large or small each variance is relative to the others? Shouldn't the variances 'disappear' as we take the empirical average with $n \rightarrow \infty$?
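(To make the question concrete, here is a quick simulation sketch with NumPy; the exponential distribution, seed, and sample size are arbitrary choices for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)

# i.i.d. Exponential samples with scale 1, so the true mean is mu = 1
n = 100_000
x = rng.exponential(scale=1.0, size=n)

# running empirical average M_k = (X_1 + ... + X_k) / k
running_mean = np.cumsum(x) / np.arange(1, n + 1)

print(running_mean[99])    # after 100 samples: still noisy
print(running_mean[-1])    # after 100,000 samples: close to 1
```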

Best answer:

The variance controls how likely the variable is to deviate from its mean, and by how much. The bounded-variance assumption is there to ensure that this deviation stays controlled.

Indeed, ask yourself what is likely to happen if $X_n$ has variance $e^{n^2}$ (or any rapidly growing sequence, for that matter). Intuitively, the deviations get ever larger at each step, with no chance of being compensated by the averaging.
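The blow-up is easy to see numerically. A sketch with NumPy is below; I use the milder growth $\operatorname{Var}(X_k) = e^k$ rather than $e^{k^2}$ as an illustrative stand-in, since $e^{k^2}$ overflows float64 almost immediately.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
idx = np.arange(1, n + 1)

# Case 1: independent N(0, 1) -- constant variance
x_const = rng.normal(0.0, 1.0, size=n)
m_const = np.cumsum(x_const) / idx

# Case 2: independent N(0, e^k) -- rapidly growing variance
# (standard deviation e^{k/2}; variance e^{k^2} would overflow)
x_grow = rng.normal(0.0, 1.0, size=n) * np.exp(idx / 2.0)
m_grow = np.cumsum(x_grow) / idx

print(abs(m_const[-1]))        # small: the average settles near 0
print(np.max(np.abs(m_grow)))  # astronomically large: no convergence
```

The constant-variance running mean hovers near the true mean, while the growing-variance one is dragged around by each new huge deviation.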

To be more precise, let $M_n$ be the mean of the first $n$ terms. If $M_n$ converges almost surely, then $X_{n+1} = (n+1)M_{n+1} - nM_n = O(n+1)$ almost surely, that is, $X_n = O(n)$ a.s.
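The identity $X_{n+1} = (n+1)M_{n+1} - nM_n$ is pure algebra, and can be sanity-checked on arbitrary sample values (a throwaway sketch, with made-up data):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=10)      # any sample values will do

n = 5
M_n  = x[:n].mean()          # M_n: mean of the first n terms
M_n1 = x[:n + 1].mean()      # M_{n+1}: mean of the first n+1 terms

# X_{n+1} = (n+1) * M_{n+1} - n * M_n
recovered = (n + 1) * M_n1 - n * M_n
print(recovered, x[n])       # x[n] is X_{n+1} (0-indexed); they agree
```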

Of course, this cannot happen in the above example: if, say, $X_n$ is centered normal with variance $e^{n^2}$, then $P(|X_n| \geq e^{n^2/2}) = P(|Z| \geq 1)$ is a positive constant independent of $n$, so by the second Borel–Cantelli lemma $|X_n| \geq e^{n^2/2}$ infinitely often almost surely, which contradicts $X_n = O(n)$.
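Assuming centered normal $X_n$ as above, we can check by Monte Carlo that this probability really is a constant of order $0.3$ (a sketch; we work with standardized draws, since $e^{n^2/2}$ itself overflows float64 for modest $n$):

```python
import numpy as np
from math import erf

rng = np.random.default_rng(3)

# If X_n is centered normal with Var(X_n) = e^{n^2}, then
# X_n = e^{n^2/2} * Z with Z ~ N(0, 1), so for every n
#   P(|X_n| >= e^{n^2/2}) = P(|Z| >= 1).
# Estimate that constant by Monte Carlo on Z directly.
z = rng.normal(size=1_000_000)
estimate = np.mean(np.abs(z) >= 1.0)

exact = 1.0 - erf(1.0 / np.sqrt(2.0))  # 2 * (1 - Phi(1)), about 0.3173
print(estimate, exact)
```

Since these probabilities do not shrink with $n$, infinitely many of the independent events $\{|X_n| \geq e^{n^2/2}\}$ occur almost surely.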