Need Check on Proof with Chebyshev's Inequality: Statistical Theory


So I have a problem in my homework, and I just need to see if my proof and thinking are correct. The problem is this: show that the sample variance $s^2$ is a consistent estimator of the variance of $X$, $\sigma^2$. I had to manipulate the $\sigma^2$ shortcut formula to get $s^2=\frac{n}{n-1}\left(\frac{1}{n}\sum_i X_i^2-\bar X^2\right)$.
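Spelled out (using the usual definitions $s^2=\frac{1}{n-1}\sum_{i=1}^n(X_i-\bar X)^2$ and $\bar X=\frac{1}{n}\sum_{i=1}^n X_i$), the manipulation is:

$$s^2=\frac{1}{n-1}\sum_{i=1}^n(X_i-\bar X)^2=\frac{1}{n-1}\left(\sum_{i=1}^n X_i^2-n\bar X^2\right)=\frac{n}{n-1}\left(\frac{1}{n}\sum_{i=1}^n X_i^2-\bar X^2\right).$$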

However, my professor gave us a hint to use the theorem on consistent estimators for the moments of $X$.

This theorem goes: Let $X$ be a random variable whose $k^{\text{th}}$ moment $\mathrm{E}[X^k]$ is unknown. Then $\frac{1}{n}\sum_i X_i^k$ is a consistent estimator of $\mathrm{E}[X^k]$.

So I am thinking that when my proof reaches the point where I have $s^2=\frac{n}{n-1}\left(\frac{1}{n}\sum_i X_i^2-\bar X^2\right)$, I can say that, by this theorem, $\frac{1}{n}\sum_i X_i^2$ and $\bar X=\frac{1}{n}\sum_i X_i$ are consistent estimators of the moments $\mathrm{E}[X^2]$ and $\mathrm{E}[X]$, and therefore $s^2$ is a consistent estimator of $\sigma^2=\mathrm{E}[X^2]-\mathrm{E}[X]^2$.
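To make that last step explicit (this is a sketch; I am assuming we may cite the fact that convergence in probability is preserved under continuous transformations):

$$\frac{1}{n}\sum_i X_i^2\xrightarrow{P}\mathrm{E}[X^2]\quad\text{and}\quad\bar X\xrightarrow{P}\mathrm{E}[X]\;\implies\;\frac{1}{n}\sum_i X_i^2-\bar X^2\xrightarrow{P}\mathrm{E}[X^2]-\mathrm{E}[X]^2=\sigma^2,$$

and since $\frac{n}{n-1}\to 1$, this would give $s^2\xrightarrow{P}\sigma^2$.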

Could anyone let me know if my thinking is correct on this? Sorry if my formatting is not so clean; this is my first time really writing in LaTeX. Thanks!


There is 1 solution below.


I guess $s^2$ is the unbiased estimator of the variance, $s^2=\frac{1}{n-1}\sum_{i=1}^n (x_i-\bar x)^2$. It is consistent if it converges in probability to $\sigma^2$, which means that $\lim_{n\to\infty}P(|s^2-\sigma^2|\geq\epsilon)=0$ for any $\epsilon>0$. Now you know by your professor's theorem that the estimator $s_n^2=\frac{1}{n}\sum_{i=1}^n (x_i-\bar x)^2$ is consistent, i.e. $\lim_{n\to\infty}P(|s_n^2-\sigma^2|\geq\epsilon)=0$. Using that, it shouldn't be too hard to show that the other limit also goes to zero.
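For instance, one way to finish (a minimal sketch) is to use $s^2=\frac{n}{n-1}s_n^2$ and write

$$s^2-\sigma^2=\frac{n}{n-1}\left(s_n^2-\sigma^2\right)+\frac{\sigma^2}{n-1}.$$

Then for any $\epsilon>0$ and $n$ large enough that $\frac{\sigma^2}{n-1}<\frac{\epsilon}{2}$ (and $n\geq 2$, so $\frac{n-1}{n}\geq\frac{1}{2}$), the event $|s^2-\sigma^2|\geq\epsilon$ forces $|s_n^2-\sigma^2|\geq\frac{\epsilon}{4}$, hence

$$P\left(|s^2-\sigma^2|\geq\epsilon\right)\leq P\left(|s_n^2-\sigma^2|\geq\tfrac{\epsilon}{4}\right)\longrightarrow 0.$$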