Explanation about proof of strong law of large numbers in Tracy's notes


There are many threads and answers on the proof of the strong law of large numbers. But my question concerns a specific example in these notes by Craig A. Tracy. The relevant part is (page 3):

Let $X_1, X_2, X_3, \dots$ denote an infinite sequence of independent random variables with common distribution. Set $S_n = X_1 + \dots + X_n$. […] (Take, for instance, in coin tossing the elementary event $\omega = HHHH\ldots$ for which $S_n(\omega) = 1$ for every $n$ and hence $\lim_{n\to\infty} S_n(\omega)/n = 1$.)

My question is: Is $S_n(\omega) = 1$ wrong?

Analysis: I take $\omega$ to denote the elementary event of all consecutive heads (as opposed to tails).

$$ S_n(\omega) = X_1(\text{HH}\ldots) + X_2(\text{HH}\ldots) + \ldots + X_n(\text{HH}\ldots) $$
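To make this sum concrete, here is how I would evaluate it numerically. Note the encoding heads $\mapsto 1$, tails $\mapsto 0$ is my own assumption; the excerpt does not spell out what value $X_i$ takes on heads:

```python
# Evaluate S_n(ω) for the outcome ω = HHHH... (all heads).
# Assumption (mine, not from the notes): heads is encoded as X_i = 1, tails as X_i = 0.
omega = "H" * 20  # the first 20 tosses of the all-heads outcome

def X(i, outcome):
    """Value of the i-th coin toss (1-indexed) on the given outcome."""
    return 1 if outcome[i - 1] == "H" else 0

def S(n, outcome):
    """Partial sum S_n = X_1 + ... + X_n evaluated on the outcome."""
    return sum(X(i, outcome) for i in range(1, n + 1))

for n in (1, 5, 20):
    # Under this encoding every toss contributes 1, so S_n(ω) = n and S_n(ω)/n = 1.
    print(n, S(n, omega), S(n, omega) / n)
```

Under that (assumed) encoding the sum grows with $n$, which only adds to my confusion about the claim $S_n(\omega) = 1$.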

The probability of a single heads result is $\frac12$, and the probability of $k$ consecutive heads is $\frac1{2^k}$. I would therefore expect $S_n(\omega) = n \cdot \frac1{2^k}$, which differs from his claim $S_n(\omega) = 1$.

If we assume $S_n(\omega) = 1$, I think the limit is also wrong:

$$ \lim_{n \to \infty} \frac{S_n(\omega)}{n} = \lim_{n\to\infty} \frac{1}{n} = 0 \neq 1 $$

But my claim $S_n(\omega) = n \cdot \frac{1}{2^k}$ is not correct either:

$$ \lim_{n \to \infty} \frac{S_n(\omega)}{n} = \lim_{n\to\infty} \frac{n \cdot \frac{1}{2^k}}{n} = \frac{1}{2^k} \neq 1 $$

If we assume $k = n$, we get $\lim_{n\to\infty} S_n(\omega)/n = 0$ as well. $k$ would have to be $0$ for the claim to hold, which would make it pointless. So where am I going wrong?
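As a sanity check on the $k = n$ case, here is a quick numerical sketch (the encoding and the identification $k = n$ are my own reading, not stated in the notes):

```python
# My candidate value S_n(ω) = n * (1/2)**k with k = n, divided by n:
# the ratio equals (1/2)**n and shrinks toward 0 as n grows, not 1.
for n in (1, 10, 50):
    ratio = n * (0.5 ** n) / n
    print(n, ratio)
```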