Estimating the variance of the sample mean


I have been reading up on bias and estimators in my textbook, and I reached a section where the author proves that $s^2$ is an unbiased estimator of $\sigma^2$, the population variance of a random variable $X$. I understand the proof, but it got me thinking: what about $\overline{X}$? It has a sampling distribution, so it must have a variance $\sigma_{\overline{X}}^2$, namely $\dfrac{\sigma^2}{n}$, but how can I find an unbiased estimator of $\sigma_{\overline{X}}^2$?

Best answer:

If $X$ has mean $\mu$ and variance $\sigma^2$, then $$\bar X = \frac{1}{n} \sum_{i=1}^n X_i,$$ where the $X_i$ are i.i.d. with the same distribution as $X$, has mean $\mu$ and variance $\sigma^2/n$, as you stated. The mean follows directly from the linearity of expectation and does not require independence; the variance, however, does require independence (otherwise nonzero covariance terms appear in $\operatorname{Var}(\bar X)$).
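You can see the $\sigma^2/n$ behaviour numerically. Here is a minimal simulation sketch (the parameter values are my own choices, not anything from the question): draw many samples of size $n$ from a distribution with known variance and check that the variance of the resulting sample means is close to $\sigma^2/n$.

```python
import random
import statistics

# Simulation sketch: the variance of the sample mean should be sigma^2 / n.
random.seed(0)
n = 25                # sample size (arbitrary illustrative choice)
sigma2 = 4.0          # population variance, so the population std is 2
trials = 50_000       # number of simulated samples

means = []
for _ in range(trials):
    sample = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)]
    means.append(sum(sample) / n)

# Empirical variance of the 50,000 sample means; should be close to
# sigma2 / n = 4.0 / 25 = 0.16.
print(statistics.pvariance(means))
```

With 50,000 trials the printed value lands very close to $0.16$; increasing `trials` tightens the agreement.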

Consequently, if $s^2$ is unbiased for $\sigma^2$, then $s^2/n$ will be unbiased for $\sigma^2/n$: $$\operatorname{E}[s^2/n] = \operatorname{E}[s^2]/n = \sigma^2/n.$$
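A quick Monte Carlo check of that last identity (again a sketch with arbitrary parameters of my own choosing): average $s^2/n$ over many simulated samples and compare the average to $\sigma^2/n$.

```python
import random
import statistics

# Unbiasedness check: the average of s^2 / n across many samples
# should converge to sigma^2 / n.
random.seed(1)
n = 10                # sample size (arbitrary illustrative choice)
sigma2 = 9.0          # population variance, so the population std is 3
trials = 50_000

estimates = []
for _ in range(trials):
    sample = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)]
    s2 = statistics.variance(sample)   # divides by n - 1, so E[s^2] = sigma^2
    estimates.append(s2 / n)

# Average of the estimates; should be close to sigma2 / n = 9.0 / 10 = 0.9.
print(sum(estimates) / trials)
```

The average of `estimates` sits near $0.9$, consistent with $s^2/n$ being unbiased for $\sigma^2/n$. Note that `statistics.variance` uses the $n-1$ denominator; using `statistics.pvariance` (denominator $n$) instead would introduce the familiar downward bias.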