Error In Proving Variance Of Sample Mean


So the variance of the sample mean $\bar X$ is $\sigma^2/n$, and I understand how to prove it through the formula $Var(nX) = n^2Var(X)$. However, I was approaching it through another method, and seem to have proven that the variance of the sample mean is $\sigma^2$, and I can't seem to find the error in my proof:

$$\mathrm{Var}(\bar X) = E[\bar X^2] - E[\bar X]^2 = E[\bar X^2] - \mu^2 = E\left[\left(\tfrac{1}{n}\textstyle\sum X_i\right)^2\right] - \mu^2 = \frac{1}{n^2} E\left[\left(\textstyle\sum X_i\right)^2\right] - \mu^2 = \frac{1}{n^2}\left(n^2 E[X^2]\right) - \mu^2 = \mathrm{Var}(X).$$

I've been looking for a while, but I can't find what's wrong with this. If someone could help point out the error in this, that would be very helpful, thanks!


The problem is with $E \left[ \left(\sum X_i \right)^2 \right]$: the step $E\left[\left(\sum X_i\right)^2\right] = n^2 E[X^2]$ treats all $n^2$ terms of the expanded square as if each equaled $E[X^2]$, but for independent $X_i$ the cross terms satisfy $E[X_i X_j] = \mu^2$ when $i \ne j$. To see this cleanly, let $Y = \sum X_i.$
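Expanding the square term by term makes the count explicit (assuming the $X_i$ are i.i.d., as in the sample-mean setup):

$$E\left[\left(\sum_{i} X_i\right)^2\right] = \sum_{i} E[X_i^2] + \sum_{i \ne j} E[X_i X_j] = n E[X^2] + n(n-1)\mu^2,$$

since independence gives $E[X_i X_j] = E[X_i]E[X_j] = \mu^2$ for $i \ne j$. Only the $n$ diagonal terms contribute $E[X^2]$, not all $n^2$ of them, and indeed $nE[X^2] + n(n-1)\mu^2 = n\sigma^2 + n^2\mu^2$.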

We know $E[Y] = n \mu$, and since the $X_i$ are independent, $\mathrm{Var}(Y) = n \sigma^2.$ That means $E[Y^2] = \mathrm{Var}(Y) + E[Y]^2 = n \sigma^2 + n^2 \mu^2.$

Then plugging this into your equation, we find $\mathrm{Var} (\bar X ) = {1 \over n^2} \left[ n \sigma^2 + n^2 \mu^2 \right] - \mu^2 = {\sigma^2 \over n}.$
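As a quick numerical sanity check (a Monte Carlo sketch using NumPy, not part of the proof; the sample size, mean, and variance below are arbitrary choices), one can simulate many samples and compare the empirical variance of $\bar X$ against $\sigma^2/n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 25           # sample size
mu = 3.0         # true mean of each X_i
sigma2 = 4.0     # true variance of each X_i
trials = 200_000

# Draw `trials` independent samples of size n and take each sample mean.
samples = rng.normal(loc=mu, scale=np.sqrt(sigma2), size=(trials, n))
xbar = samples.mean(axis=1)

# Empirical variance of the sample mean vs. the theoretical sigma^2 / n.
print(xbar.var())    # close to sigma2 / n = 0.16, nowhere near sigma2 = 4
print(sigma2 / n)
```

The empirical variance lands near $\sigma^2/n = 0.16$ rather than $\sigma^2 = 4$, consistent with the corrected derivation.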