Law of Large Numbers (LLN) relation to Binomial Distribution


I am currently reading a book which explains the efficiency of ensemble methods in statistical learning. In the following I have modified the statements so that no knowledge of statistical learning is needed.

The author considers a slightly biased coin ($51\%$ heads). He states that if you toss the coin $1000$ times, there will be a majority of heads about $72\%$ of the time. This is obviously just computed by setting $X\sim \mathrm{Bin}(1000,0.51)$ and calculating $1-F_X(500)$.
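For reference, this tail probability can be computed exactly with the Python standard library (a quick sketch; `math.comb` requires Python 3.8+):

```python
from math import comb

def majority_heads_prob(n_tosses: int, p: float) -> float:
    """P(X > n_tosses/2) for X ~ Bin(n_tosses, p): a strict majority of heads."""
    threshold = n_tosses // 2  # majority means strictly more than 500 of 1000
    return sum(comb(n_tosses, k) * p**k * (1 - p)**(n_tosses - k)
               for k in range(threshold + 1, n_tosses + 1))

print(majority_heads_prob(1000, 0.51))  # roughly 0.73 -- the author's "about 72%"
```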

Then the author states that this is due to the LLN, i.e. that the ratio of heads asymptotically approaches $51\%$ as the number of tosses grows.
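The LLN claim itself is easy to illustrate with a short simulation (a sketch using Python's `random` module; the seed is an arbitrary choice for reproducibility):

```python
import random

def head_ratio(n: int, p: float = 0.51, seed: int = 42) -> float:
    """Simulate n tosses of a coin with P(heads) = p; return the observed ratio."""
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(n)) / n

for n in (100, 10_000, 1_000_000):
    print(n, head_ratio(n))  # the observed ratio settles near 0.51 as n grows
```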

I am wondering whether this is really a reasonable explanation. In what way is there a connection between the LLN and the computation I did above? I really can't see a non-trivial way to explain it using the LLN.


Best answer:

Let $p>\frac12$ denote the probability of heads and let $X_n$ denote the number of heads that show up in $2n$ throws.

Then, according to the WLLN: $$\lim_{n\to\infty}P\left(\left|\frac{X_n}{2n}-p\right|>\epsilon\right)=0\text{ for every }\epsilon>0\tag1$$or equivalently:$$\lim_{n\to\infty}P\left(\left|\frac{X_n}{2n}-p\right|<\epsilon\right)=1\text{ for every }\epsilon>0\tag2$$

Further we have: $$P\left(n<X_{n}\right)=P\left(\frac{1}{2}-p<\frac{X_{n}-2np}{2n}\right)\geq P\left(\left|\frac{X_{n}}{2n}-p\right|<p-\frac{1}{2}\right)\tag3$$ where the LHS is the probability of a majority of heads in $2n$ throws. The inequality holds because the event $\left|\frac{X_n}{2n}-p\right|<p-\frac12$ implies $\frac{X_n}{2n}-p>\frac12-p$, i.e. $X_n>n$.

Here $p-\frac12>0$, so by $(2)$ (with $\epsilon=p-\frac12$) the RHS of $(3)$ converges to $1$, and hence so does the LHS: the probability of a heads majority.
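This convergence can be checked numerically by evaluating $P(X_n>n)$ for increasing $n$. A sketch using only the standard library, working in log space via `math.lgamma` to avoid float underflow for large $n$:

```python
from math import lgamma, log, exp

def log_binom_pmf(n: int, k: int, p: float) -> float:
    """log P(X = k) for X ~ Bin(n, p), via lgamma to stay in float range."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def majority_prob(two_n: int, p: float) -> float:
    """P(X > n) for X ~ Bin(2n, p): probability of a strict heads majority."""
    half = two_n // 2
    return sum(exp(log_binom_pmf(two_n, k, p)) for k in range(half + 1, two_n + 1))

for two_n in (100, 1000, 10000):
    print(two_n, majority_prob(two_n, 0.51))  # increases toward 1 as 2n grows
```

With $p=0.51$ the majority probability climbs from barely above one half at $100$ tosses toward certainty at $10000$, exactly the behavior the bound $(3)$ predicts.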