I am trying to clarify these two concepts and understand the difference between the Central Limit Theorem (https://en.wikipedia.org/wiki/Central_limit_theorem) and the Weak Law of Large Numbers (https://en.wikipedia.org/wiki/Law_of_large_numbers).
As an example, suppose I have a coin and I don't know the true probability of Heads or Tails - I start to flip the coin again and again:
- The Law of Large Numbers states that if I flip this coin enough times, the observed proportion of Heads will converge to the true probability of getting a Heads
- The Central Limit Theorem states that as I flip the coin again and again, the distribution of the proportion of Heads, suitably centered and scaled, will approach a Standard Normal Distribution
Is my understanding of this correct?
Thanks!
Yes. Khinchin's WLLN tells us that for a sequence of independent and identically distributed (iid) random variables with a finite (but unobservable) expected value, the sample average converges in probability to the true expected value. In your coin-toss example, code heads as 1: since each flip has the same distribution and is independent of the other flips, and since the sample mean (the proportion of 1s) is the natural estimator of the probability of heads, the mean of your random variable converges in probability to its unobservable true expected value as the number of flips goes to infinity. Remember that there are other forms of the LLN: Kolmogorov's theorems, Markov's theorem, and the ergodic theorem.
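A quick simulation makes the WLLN concrete. This is a minimal sketch assuming a fair coin (`p = 0.5` is an illustrative value, not something the flipper would know); the running proportion of heads drifts toward the true probability as the number of flips grows:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5  # true probability of heads -- assumed here for illustration; unobservable in practice

# Flip the coin n times (1 = heads, 0 = tails) and look at the sample proportion.
# By the WLLN, the proportion gets closer to p as n grows.
for n in (100, 10_000, 1_000_000):
    flips = rng.binomial(1, p, size=n)
    print(f"n = {n:>9}: proportion of heads = {flips.mean():.4f}")
```

The deviations from `p` shrink roughly like $1/\sqrt{n}$, which is exactly the scaling the CLT below makes precise.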
The Lindeberg-Levy CLT tells us that for an iid sample of a variable with finite expected value $\mu$ and variance $\sigma^2$, $\sqrt{n}(\bar{X}_n - \mu)\rightarrow_d N(0,\sigma^2)$, where $\bar{X}_n=\dfrac{1}{n}\sum_{i=1}^{n} X_i$. Notice that the value of $\bar{X}_n$ depends on $n$, i.e. the sample size. In other words, the centered and scaled sample mean converges in distribution to the normal, and this holds regardless of the initial distribution.
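The convergence in distribution can also be checked by simulation. A sketch, with `p`, `n`, and the number of repetitions chosen arbitrarily for illustration: for a Bernoulli($p$) flip, $\mu = p$ and $\sigma^2 = p(1-p)$, so the quantity $\sqrt{n}(\bar{X}_n - \mu)$ should look approximately $N(0, \sigma^2)$ even though each individual flip is far from normal.

```python
import numpy as np

rng = np.random.default_rng(1)
p, n, reps = 0.3, 500, 20_000  # assumed values for illustration

mu = p                          # mean of a single Bernoulli(p) flip
sigma = np.sqrt(p * (1 - p))    # sd of a single flip

# For each of `reps` experiments, flip the coin n times and compute
# sqrt(n) * (sample mean - mu); the CLT says these values are
# approximately N(0, sigma^2).
samples = rng.binomial(1, p, size=(reps, n))
z = np.sqrt(n) * (samples.mean(axis=1) - mu)

print(f"empirical mean of z: {z.mean():.3f} (CLT predicts 0)")
print(f"empirical sd of z:   {z.std():.3f} (CLT predicts sigma = {sigma:.3f})")
```

Replacing the Bernoulli draws with any other distribution that has finite mean and variance (uniform, exponential, etc.) leaves the normal limit unchanged, which is the "regardless of the initial distribution" part of the theorem.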