Proof that sample variance is biased in presence of autocorrelation


When the $X_i$ are uncorrelated, each with mean $\mu$ and variance $\sigma^2$, we can show the sample variance is unbiased:

$$E[s^2] = E\left(\frac{\sum^n_{i=1}(X_i - \bar{X})^2}{n-1}\right) = \sigma^2$$

Proof:

$$E\left(\sum^n_{i=1}(X_i - \bar{X})^2\right) = E\left(\sum^n_{i=1}X_i^2 - 2\bar{X}\sum^n_{i=1}X_i + n\bar{X}^2\right) = E\left(\sum^n_{i=1}X_i^2 - 2n\bar{X}^2 + n\bar{X}^2\right) = \sum^n_{i=1} E(X_i^2) - nE(\bar{X}^2)$$

Since $E(X_i^2) = \sigma^2 + \mu^2$ and, for uncorrelated $X_i$, $E(\bar{X}^2) = \operatorname{Var}(\bar X) + \mu^2 = \frac{\sigma^2}{n} + \mu^2$, this equals

$$n\sigma^2 + n\mu^2 - \sigma^2 - n\mu^2 = (n-1)\sigma^2$$
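The unbiasedness can be checked numerically. Below is a minimal Monte Carlo sketch (sample size, number of replications, and the normal distribution are all my choices, not from the question): the average of the $n-1$-divisor sample variance over many i.i.d. samples should land near $\sigma^2$.

```python
import numpy as np

# Monte Carlo check: for i.i.d. draws, the average sample variance
# (computed with the n-1 divisor, ddof=1) should approach sigma^2.
rng = np.random.default_rng(0)
n, reps, sigma = 10, 200_000, 2.0
x = rng.normal(0.0, sigma, size=(reps, n))  # i.i.d. N(0, sigma^2) samples
s2 = x.var(axis=1, ddof=1)                  # sample variance per replication
print(s2.mean())                            # close to sigma^2 = 4.0
```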

However, if there is autocorrelation among the $X_i$, which part of this proof is no longer correct?

Best answer:

The step that fails is $$\operatorname{E}[\bar X^2] = \frac{\sigma^2}{n} + \mu^2.$$ It relies on $\operatorname{Var}(\bar X) = \sigma^2/n$, which requires the $X_i$ to be uncorrelated. Under autocorrelation, $\operatorname{E}[X_i X_j] \ne \operatorname{E}[X_i]\operatorname{E}[X_j]$ for $i \ne j$, and in general $$\operatorname{Var}(\bar X) = \frac{1}{n^2}\sum_{i=1}^n\sum_{j=1}^n \operatorname{Cov}(X_i, X_j) = \frac{\sigma^2}{n} + \frac{2}{n^2}\sum_{i<j}\operatorname{Cov}(X_i, X_j),$$ so the covariance terms carry through and $s^2$ becomes biased (downward when the correlations are positive).
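To see the bias concretely, here is a Monte Carlo sketch using a stationary AR(1) process as an assumed example of positive autocorrelation (the model, $\varphi = 0.7$, and the sample size are illustrative choices, not part of the question). The innovations are scaled so the marginal variance is exactly $\sigma^2 = 1$, yet the average sample variance comes out smaller.

```python
import numpy as np

# Monte Carlo sketch (assumed AR(1) example): with positive autocorrelation,
# E[s^2] falls below the true marginal variance sigma^2.
rng = np.random.default_rng(1)
n, reps, phi = 10, 100_000, 0.7
sigma2 = 1.0                                # target marginal variance
eps_sd = np.sqrt(sigma2 * (1 - phi**2))     # innovation sd for stationarity
x = np.empty((reps, n))
x[:, 0] = rng.normal(0.0, np.sqrt(sigma2), reps)  # start in stationary dist.
for t in range(1, n):
    x[:, t] = phi * x[:, t - 1] + rng.normal(0.0, eps_sd, reps)
s2 = x.var(axis=1, ddof=1)                  # sample variance per series
print(s2.mean())                            # noticeably below sigma^2 = 1.0
```

Making $\varphi$ negative flips the sign of most covariance terms, and the same experiment then shows an upward bias instead.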