If $m$ is the number of successes in $n$ independent trials, each with success probability $p$, how can one prove that $\frac mn$ converges in probability to $p$ as $n\to\infty$?
On 2026-04-01 03:59:39
Convergence in probability for binomial distribution
7k views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
You can use the weak law of large numbers (or even the strong version, but it's not necessary here).
Let $X_1,X_2,\dots,X_n$ be a sequence of i.i.d. random variables such that $P(X_i=1)=1-P(X_i=0)=p$.
Then the number of successes in $n$ trials is given by $\sum_{i=1}^n X_i$, and applying the weak law of large numbers shows that $\frac{1}{n}\sum_{i=1}^n X_i$ converges in probability to $E(X_1)=p$.
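As an illustration (not part of the proof), a quick simulation shows the sample proportion settling near $p$ as $n$ grows; the function name and parameter choices below are my own:

```python
import numpy as np

def sample_proportion(n: int, p: float, rng: np.random.Generator) -> float:
    """Simulate n Bernoulli(p) trials and return the fraction of successes."""
    return rng.binomial(n, p) / n

rng = np.random.default_rng(0)
for n in [10, 1_000, 100_000]:
    # As n grows, the printed fraction should drift toward p = 0.3.
    print(n, sample_proportion(n, 0.3, rng))
```

Of course, a simulation only illustrates the convergence; the proof below is what establishes it.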
I will add some details and a direct proof; you can skip the parts that are obvious to you. We can represent $m(n)$ by a random variable $Y_n\sim \mathrm {Bin}(n,p),$ that is, $Y_n$ is the number of successes in $n$ independent trials with success probability $p.$ In other words, $$Y_n=\sum\limits_{i=1}^n X_i,$$ where $X_1,\ldots,X_n\stackrel{iid}{\sim}\mathrm{Ber}(p),$ that is, each $X_i$ is $1$ with probability $p$ and $0$ with probability $1-p.$ From this, we can deduce $$\mathbb{E}\left[\frac{Y_n}{n}\right]=\frac{n}{n}\mathbb{E}[X_1]=p$$ and $$\mathrm{Var}\left(\frac{1}{n}Y_n\right)=\frac{1}{n^2}\mathrm{Var}\left(Y_n\right)=\frac{n}{n^2}\mathrm{Var}\left(X_1\right)=\frac{p(1-p)}{n},$$ where the second equality uses the independence of the $X_i.$

Now consider a random variable $X$ with finite second moment $\mathbb{E}[X^2]<\infty,$ and take some $\varepsilon>0.$ Then $$\mathbb{P}[|X-\mathbb{E}[X]|\geq \varepsilon]=\mathbb{E}\left[\mathbf{1}_{|X-\mathbb{E}[X]|\geq \varepsilon}\right]=\mathbb{E}\left[\mathbf{1}_{\frac{(X-\mathbb{E}[X])^2}{\varepsilon^2}\geq 1}\right]\leq\frac{\mathbb{E}[(X-\mathbb{E}[X])^2]}{\varepsilon^2}=\frac{\mathrm{Var}(X)}{\varepsilon^2},$$ where the inequality uses $\mathbf{1}_{Z\geq 1}\leq Z$ for $Z\geq 0.$ This is Chebyshev's inequality, by the way.
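The mean and variance computed above can be sanity-checked by simulation; this is just a numerical check under parameter values I chose, not part of the argument:

```python
import numpy as np

def mean_var_of_proportion(n, p, reps, rng):
    """Monte Carlo estimates of E[Y_n/n] and Var(Y_n/n) for Y_n ~ Bin(n, p)."""
    props = rng.binomial(n, p, size=reps) / n
    return props.mean(), props.var()

rng = np.random.default_rng(2)
n, p = 200, 0.3
m, v = mean_var_of_proportion(n, p, 100_000, rng)
# The estimates should be close to the theoretical values p and p(1-p)/n.
print(m, v)
print(p, p * (1 - p) / n)
```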
Let us now go back to our original problem. To prove convergence in probability, we have to show that for all $\varepsilon>0,$ $$\mathbb{P}[|Y_n/n-p|\geq \varepsilon]\xrightarrow{n\to\infty}0.$$ By what I just showed, $$\mathbb{P}[|Y_n/n-p|\geq \varepsilon]\leq\frac{p(1-p)}{n\varepsilon^2}\xrightarrow{n\to\infty}0,$$ so we are done.
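The final bound can also be seen numerically: the empirical tail probability $\mathbb{P}[|Y_n/n-p|\geq\varepsilon]$ sits below the Chebyshev bound $\frac{p(1-p)}{n\varepsilon^2}$ and both shrink as $n$ grows. A sketch, with my own choice of $p$, $\varepsilon$, and sample sizes:

```python
import numpy as np

def tail_prob_estimate(n, p, eps, reps, rng):
    """Monte Carlo estimate of P(|Y_n/n - p| >= eps) for Y_n ~ Bin(n, p)."""
    props = rng.binomial(n, p, size=reps) / n
    return np.mean(np.abs(props - p) >= eps)

rng = np.random.default_rng(1)
p, eps = 0.3, 0.05
for n in [50, 500, 5_000]:
    est = tail_prob_estimate(n, p, eps, 20_000, rng)
    bound = min(p * (1 - p) / (n * eps ** 2), 1.0)
    # The estimate should never exceed the (capped) Chebyshev bound.
    print(n, est, bound)
```

Note how loose Chebyshev is here: the true tail probability decays much faster than $1/n$, but $1/n$ decay is all the proof needs.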
Note that all of this applies to any variable of the form $Y_n=\sum\limits_{i=1}^n X_i$ with $X_1,\ldots,X_n$ i.i.d. and $\mathbb{E}[X_i^2]<\infty,$ not just to binomial random variables. This is the weak law of large numbers, which states that the average of i.i.d. random variables with finite expectation converges in probability to that expectation. We can even lift the assumption of a finite second moment: a finite first moment suffices, as a consequence of Etemadi's law of large numbers. However, that proof is a bit lengthy, so I will leave you with the classical weak law of large numbers. All in all, you have $$\frac{m}{n}=\frac{Y_n}{n}\xrightarrow{\ \mathbb{P}\ }p\qquad\text{as }n\to\infty.$$