Why does the WLLN imply this convergence?


I'm trying to understand a theorem that shows the convergence of the sample standard deviation to the standard deviation parameter. I don't understand why $\frac{1}{n}\sum X_i^2\to \sigma^2+\mu^2$ follows from the WLLN. It must be some small detail I'm not seeing.
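For context, the step this limit sits in is presumably the standard decomposition (my reconstruction of the usual argument, not quoted from the theorem):

$$S_n^2 \;=\; \frac{1}{n-1}\sum_{i=1}^n (X_i-\bar X_n)^2 \;=\; \frac{n}{n-1}\left(\frac{1}{n}\sum_{i=1}^n X_i^2 - \bar X_n^2\right),$$

so once $\frac{1}{n}\sum X_i^2 \xrightarrow{p} \sigma^2+\mu^2$ and $\bar X_n \xrightarrow{p} \mu$, the continuous mapping theorem gives $S_n^2 \xrightarrow{p} \sigma^2$, and taking a square root gives $S_n \xrightarrow{p} \sigma$.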



If $Y_1, \ldots, Y_n$ are i.i.d., the WLLN implies $\frac{1}{n} \sum_{i=1}^n Y_i$ converges in probability to $E[Y_1]$. Here, with $Y_i=X_i^2$ you have $E[Y_1]=E[X_1^2] = \text{Var}(X_1) + E[X_1]^2 = \sigma^2 + \mu^2$.
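A quick numerical sanity check of this (an added illustration, not part of the original answer): draw i.i.d. normal samples with $\mu=2$, $\sigma=3$, so the limit should be $\sigma^2+\mu^2=13$, and watch the sample mean of $X_i^2$ approach it as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0
target = sigma**2 + mu**2  # 13.0

# Apply the WLLN to Y_i = X_i^2: the running average of the squared
# samples should stabilize near sigma^2 + mu^2 for large n.
for n in (100, 10_000, 1_000_000):
    x = rng.normal(mu, sigma, size=n)
    print(n, np.mean(x**2))
```

For $n = 10^6$ the average is typically within a few hundredths of 13, consistent with convergence in probability.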


An alternative proof uses Chebyshev's inequality:

$$\mathbb{P}\Big[|S_n^2-\sigma^2|\geq \epsilon\Big]\leq\frac{\mathbb{E}(S_n^2-\sigma^2)^2}{\epsilon^2}=\frac{\mathbb{V}(S_n^2)}{\epsilon^2}\xrightarrow{n\to \infty}0$$

Thus a sufficient condition for $S_n^2\xrightarrow{p}\sigma^2$ is that $\mathbb{V}(S_n^2)\xrightarrow{n\to \infty}0$. (The middle equality uses the fact that $S_n^2$ is unbiased, i.e. $\mathbb{E}[S_n^2]=\sigma^2$.)


This proof is taken from Casella and Berger, Statistical Inference, Chapter 5, Example 5.5.3.
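As a concrete check of that sufficient condition (an added example, assuming normal data): if $X_i \sim N(\mu,\sigma^2)$, then $(n-1)S_n^2/\sigma^2 \sim \chi^2_{n-1}$, whose variance is $2(n-1)$, so

$$\mathbb{V}(S_n^2) \;=\; \frac{\sigma^4}{(n-1)^2}\cdot 2(n-1) \;=\; \frac{2\sigma^4}{n-1} \;\xrightarrow{n\to\infty}\; 0,$$

and the Chebyshev bound above applies.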