Showing $\sqrt{S^2}$ is a consistent estimator of standard deviation $\sigma$.


Given $X_1,\dots,X_n\stackrel{iid}{\sim}f(x;\mu,\sigma^2)$ with population mean $\mu$ and population standard deviation $\sigma$, I want to show that $\sqrt{S^2}$ is a consistent estimator of $\sigma$. If I read my book correctly, we want to show $$\lim_{n\rightarrow\infty}P(|\sqrt{S_n^2}-\sigma|>\epsilon)=0\quad\forall\epsilon >0. $$

So invoking Chebyshev's inequality we have

$$ \begin{align*} P((\sqrt{S_n^2}-\sigma)^2>\epsilon^2)\leq&\frac{\mathbb{E}[(\sqrt{S_n^2}-\sigma)^2]}{\epsilon^2} \\ ={}&\frac{\mathrm{Var}(\sqrt{S_n^2})}{\epsilon^2}\\ ={}&\frac{\mathbb{E}[(\sqrt{S_n^2})^2]-[\mathbb{E}(\sqrt{S_n^2})]^2}{\epsilon^2}\\ ={}&\frac{\mathbb{E}(S_n^2)-[\mathbb{E}(\sqrt{S_n^2})]^2}{\epsilon^2}\\ ={}&\frac{\sigma^2-[\mathbb{E}(\sqrt{S_n^2})]^2}{\epsilon^2}\text{ (since $S^2$ is an unbiased estimator)} \end{align*} $$

This is where I get stuck, because I know nothing about the distribution besides the observations being $\textit{iid}$, so I have no idea what to do with the $[\mathbb{E}(\sqrt{S^2})]^2$ term. I know I need a term involving $n$ in order to take a limit, but I am unsure whether I have ever taken the expected value of a sample standard deviation, or whether I even know how to do so properly. Any help or guidance would be appreciated. Just to save time: Jensen's inequality is something I have not yet covered in the course.
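For intuition (not a proof), here is a quick Monte Carlo sketch of the quantity in question. It assumes standard normal data, so $\sigma = 1$, and estimates the tail probability $P(|S_n-\sigma|>\epsilon)$ for a few sample sizes; the probability should shrink toward $0$ as $n$ grows.

```python
# Monte Carlo sketch of P(|S_n - sigma| > eps) -> 0, assuming N(0, 1) data (sigma = 1).
import numpy as np

rng = np.random.default_rng(0)
sigma, eps, reps = 1.0, 0.1, 2000

def tail_prob(n):
    # Draw `reps` independent samples of size n, compute S_n for each,
    # and estimate the tail probability by the fraction of large deviations.
    samples = rng.normal(0.0, sigma, size=(reps, n))
    s = samples.std(axis=1, ddof=1)  # S_n = sqrt of the unbiased sample variance
    return np.mean(np.abs(s - sigma) > eps)

for n in (10, 100, 1000):
    print(n, tail_prob(n))
```

The printed fractions decrease with $n$, matching the definition of consistency stated above.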



BEST ANSWER

I have done a little more research - here is my answer.

Given $X_1,\dots,X_n\stackrel{iid}{\sim}f(x;\mu,\sigma^2)$, the weak law of large numbers gives $\frac{1}{n}\sum_{i=1}^n X_i^2\stackrel{p}{\longrightarrow}\mathbb{E}(X_1^2)$ and $\bar{X}\stackrel{p}{\longrightarrow}\mu$, so

$$\frac{1}{n}\sum_{i=1}^{n}(X_i-\bar{X})^2=\frac{1}{n}\sum_{i=1}^{n}X_i^2-\bar{X}^2\stackrel{p}{\longrightarrow} \mathbb{E}(X_1^2)-\mu^2=\sigma^2$$

The square root function is continuous on $[0,\infty)$, so by the continuous mapping theorem

$$\sqrt{\frac{1}{n}\sum_{i=1}^{n}(X_i-\bar{X})^2}\stackrel{p}{\longrightarrow} \sqrt{\sigma^2}=\sigma$$

Finally, since $S_n^2=\frac{n}{n-1}\cdot\frac{1}{n}\sum_{i=1}^{n}(X_i-\bar{X})^2$ and $\frac{n}{n-1}\rightarrow 1$, the same limit holds for the sample variance, and therefore

$$\sqrt{S^2}\stackrel{p}{\longrightarrow}\sigma$$
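The convergence can be sanity-checked numerically. This sketch again assumes $N(0,1)$ data, so $\sigma=1$, and computes $\sqrt{S_n^2}$ for increasing $n$; the absolute error should shrink.

```python
# Sanity check: sqrt(S^2) approaches sigma = 1 as n grows (N(0, 1) data assumed).
import numpy as np

rng = np.random.default_rng(1)
for n in (100, 10_000, 1_000_000):
    x = rng.normal(size=n)
    s = x.std(ddof=1)       # sqrt of the unbiased sample variance
    print(n, abs(s - 1.0))  # absolute error tends to shrink as n grows
```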

This was more theory-heavy than I am used to, which is where I am weak, so it has been a good learning experience.


Henry said it all, but I will elaborate here. The sample variance is \begin{equation} S^2 = \frac{1}{n-1} \sum_{i=1}^n \left( X_i - \overline{X} \right)^2, \end{equation} while the uncorrected average squared deviation is \begin{equation} \hat{\sigma}_n^2 = \frac{1}{n} \sum_{i=1}^n \left( X_i - \overline{X} \right)^2, \end{equation} which converges in probability to $\sigma^2$ by the law of large numbers (note that $\mathbb{E}(\overline{X}) = \mu$ since $X_i \sim f(x;\mu,\sigma)$). The two estimators differ only by a deterministic factor: \begin{equation} \frac{\hat{\sigma}_n^2}{S^2} = \frac{n-1}{n} \longrightarrow 1 \quad \text{as } n\to\infty, \end{equation} so $S^2$ converges in probability to $\sigma^2$ as well, and hence $\sqrt{S^2}\stackrel{p}{\longrightarrow}\sigma$.
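The factor relating the two variance formulas can be checked directly on any data set. A small sketch (the data values here are arbitrary, chosen only for illustration):

```python
# Check that the 1/(n-1) sample variance equals n/(n-1) times the 1/n version.
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
n = len(x)
v_n  = x.var(ddof=0)   # (1/n)     * sum (x_i - xbar)^2
v_n1 = x.var(ddof=1)   # (1/(n-1)) * sum (x_i - xbar)^2

print(v_n, v_n1, v_n / v_n1)  # ratio is exactly (n - 1)/n = 7/8 = 0.875
assert np.isclose(v_n / v_n1, (n - 1) / n)
```

As $n$ grows this ratio tends to $1$, which is why the correction factor does not affect consistency.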