Meaning of the convergence of the sum of a sequence of random variables to a constant


Suppose you have a sequence of independent real-valued random variables $X_1, X_2, \dots$ and let $S_n = \sum_{i=1}^n X_i$.

What does it mean for $(1/n)S_n$ to converge to a constant $c$?

From what I understand, $(1/n)S_n$ needs to be unbiased and the $X_i$ independent, but I'm not sure of the actual meaning of the convergence.

I also think that $S_n$ converges to a normal distribution under some conditions, maybe by the weak law of large numbers?

In the weak law of large numbers, the convergence is in probability: $P \left ( \left | \frac{S_n}{n} - \mu \right |>\epsilon \right ) \to 0$ as $n \to \infty$, for any fixed $\epsilon>0$. Intuitively this means that for large $n$, a histogram of samples of $\frac{S_n}{n}$ (all drawn at the same value of $n$) will look like a sharp peak near $\mu$. The subtlety in formalizing that is that how large $n$ needs to be depends on both the distance threshold $\epsilon$ and the probability threshold that you impose.
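You can see convergence in probability directly by Monte Carlo. The sketch below estimates $P(|S_n/n - \mu| > \epsilon)$ for fair coin flips ($\mu = 0.5$) at a few values of $n$; the choice of $\epsilon$, the $n$ values, and the trial count are illustrative, not canonical.

```python
import random

def tail_prob(n, eps=0.05, trials=2000, mu=0.5):
    """Estimate P(|S_n/n - mu| > eps) for n fair coin flips,
    by counting how often the sample mean deviates from mu by more than eps."""
    hits = 0
    for _ in range(trials):
        mean = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(mean - mu) > eps:
            hits += 1
    return hits / trials

for n in (10, 100, 1000):
    # The estimated probability shrinks toward 0 as n grows,
    # which is exactly what convergence in probability asserts.
    print(n, tail_prob(n))
```

Note that for a fixed $\epsilon$ the estimated probability decreases with $n$, but if you shrink $\epsilon$ you need a larger $n$ to get the same probability, which is the dependence on both thresholds mentioned above.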

In the strong law of large numbers, the convergence is almost sure: $P \left ( \frac{S_n}{n} \to \mu \right ) = 1$. Intuitively this means that if you plot $\frac{S_n}{n}$ as a function of $n$, you are essentially guaranteed to see that it goes to $\mu$.
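The almost-sure statement is about a single sample path. A minimal sketch, tracking the running mean $S_n/n$ along one realization of i.i.d. Exp(1) draws (so $\mu = 1$); the function name and parameters are illustrative:

```python
import random

def running_means(n_max, seed=None):
    """Return the sequence S_1/1, S_2/2, ..., S_{n_max}/n_max
    for one realization of i.i.d. Exponential(1) random variables."""
    rng = random.Random(seed)
    total = 0.0
    means = []
    for n in range(1, n_max + 1):
        total += rng.expovariate(1.0)  # one draw of X_n
        means.append(total / n)
    return means

# One sample path: early values fluctuate, late values settle near mu = 1.
path = running_means(100_000, seed=42)
print(path[9], path[999], path[-1])
```

Plotting `path` against $n$ gives exactly the picture described above: the strong law says that, with probability one, the whole trajectory eventually stays close to $\mu$.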

Both hold when $X_i$ are independent and identically distributed with finite mean, regardless of any assumptions about higher moments.

The central limit theorem is something else. It appears when you look at $\frac{S_n-n\mu}{\sqrt{n}}$, so you center everything and scale down the fluctuations by only a factor of $\sqrt{n}$ instead of $n$ (so the fluctuations do not simply vanish in the limit). It also involves convergence in distribution instead, which is easy to formalize: a sequence of random variables $X_n$ converges in distribution to a random variable $X$ if the sequence of CDFs $F_{X_n}$ converges to $F_X$ pointwise at every point where $F_X$ is continuous.
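That definition can also be checked numerically. The sketch below compares, at one point $x$, the empirical CDF of $\frac{S_n - n\mu}{\sqrt{n}}$ for fair coin flips ($\mu = 0.5$, $\sigma = 0.5$) with the CDF of $N(0, \sigma^2)$; all parameter values here are illustrative.

```python
import math
import random

def standardized_sum(n, rng):
    """One sample of (S_n - n*mu)/sqrt(n) for n fair coin flips (mu = 0.5)."""
    s = sum(rng.random() < 0.5 for _ in range(n))
    return (s - n * 0.5) / math.sqrt(n)

def empirical_cdf_at(x, n=1000, trials=4000, seed=0):
    """Fraction of simulated standardized sums that are <= x."""
    rng = random.Random(seed)
    return sum(standardized_sum(n, rng) <= x for _ in range(trials)) / trials

def normal_cdf(x, sigma=0.5):
    """CDF of N(0, sigma^2), the CLT limit for coin flips (sigma = 1/2)."""
    return 0.5 * (1 + math.erf(x / (sigma * math.sqrt(2))))

# The two numbers should be close, as convergence in distribution predicts.
print(empirical_cdf_at(0.5), normal_cdf(0.5))
```

Notice that it is $F_{X_n}(x)$, a plain sequence of numbers for each fixed $x$, that converges, which is why convergence in distribution is the easiest of the three modes to formalize.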