Variance of sample mean (problems with proof)


Assume I have $\{x_1,\ldots, x_N\}$, an iid (independent, identically distributed) sample of size $N$ of observations of a random variable $\xi$ with unknown mean $m_1$, variance (second central moment) $m_{c_2}$, and second raw moment $m_2$. I want to use the sample mean $\overline{x}=\frac{1}{N}\sum_{i=1}^Nx_i$ as an estimator of the true mean.


So I want to find its bias and variance. Everything is simple with the bias, but not with the variance. First I use independence: $$\mathbb{var}[\overline{x}]= \mathbb{var}\left[ \frac{1}{N}\sum_{i=1}^Nx_i \right]\stackrel{\text{iid}}{=} \frac{1}{N^2}\sum_{i=1}^N\mathbb{var}\left[x_i \right]= \frac{m_{c_2}}{N}$$

At the same time I can use the representation of variance in terms of raw first and second moments: $$\mathbb{var}[\overline{x}]= \mathbb{E}[(\overline{x}-\mathbb{E}[\overline{x}])^2]= \mathbb{E}[\overline{x}^2]-\mathbb{E}[\overline{x}]^2= \mathbb{E}[\overline{x}^2]-m_1^2$$

Then $$ \begin{eqnarray} \mathbb{E}[\overline{x}^2]&=&\mathbb{E}\left[\left( \frac{1}{N}\sum_{i=1}^Nx_i \right)\left( \frac{1}{N}\sum_{j=1}^Nx_j \right)\right]= \mathbb{E}\left[\frac{1}{N^2}\sum_{i=1}^N\sum_{j=1}^Nx_i x_j \right]= \\ &=&\mathbb{E}\left[\frac{1}{N^2}\sum_{i=1}^Nx^2_i+\frac{1}{N^2}\sum_{i=1,i\neq j}^N\sum_{j=1}^Nx_i x_j \right]\stackrel{\text{iid}}{=}\frac{m_2}{N} \end{eqnarray} $$

So $$\mathbb{var}[\overline{x}]=\frac{m_2}{N}-m_1^2$$

This clearly doesn't match what I got earlier. So where's the catch?


BEST ANSWER

Corrigendum

$$\mathbb{var}[\overline{x}]=\mathbb{var}\left[ \frac{1}{N}\sum_{i=1}^Nx_i \right]\stackrel{\text{iid}}{=} \frac{1}{N^2}\sum_{i=1}^N\mathbb{var}\left[x_i \right]=N \cdot \frac{m_{c_2}}{N^2}=\frac{m_{c_2}}{N}$$

Since $\{x_1,\ldots, x_N\}$ are iid, $\operatorname{Cov}(x_i,x_j)=\mathbb{E}[x_i x_j]-\mathbb{E}[x_i]\,\mathbb{E}[x_j]=0$ for $i\neq j$, so $\mathbb{E}[x_i x_j]=m_1^2$:

$$\begin{eqnarray} \mathbb{E}[\overline{x}^2]&=&\mathbb{E}\left[\left( \frac{1}{N}\sum_{i=1}^Nx_i \right)\left( \frac{1}{N}\sum_{j=1}^Nx_j \right)\right]= \mathbb{E}\left[\frac{1}{N^2}\sum_{i=1}^N\sum_{j=1}^Nx_i x_j \right]=\\ &=&\mathbb{E}\left[\frac{1}{N^2}\sum_{i=1}^Nx^2_i+\frac{1}{N^2}\sum_{i=1,i\neq j}^N\sum_{j=1}^Nx_i x_j \right]\stackrel{\text{iid}}{=}\frac{m_2}{N}+\frac{N^2-N}{N^2}m_1^2 \end{eqnarray}$$

$$\mathbb{var}[\overline{x}]=\mathbb{E}[\overline{x}^2]-\mathbb{E}[\overline{x}]^2=\frac{m_2}{N}+\frac{N-1}{N}m_1^2-m_1^2=\frac{m_2-m_1^2}{N}=\frac{m_{c_2}}{N}$$
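The corrected result $\mathbb{var}[\overline{x}]=m_{c_2}/N$ can be checked numerically. A minimal Monte Carlo sketch, assuming an illustrative distribution for $\xi$ (Exponential with rate 1, so $m_1=1$, $m_2=2$, $m_{c_2}=1$; the choice of distribution is mine, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: xi ~ Exponential(1), so m1 = 1, m2 = 2,
# and m_c2 = m2 - m1**2 = 1.
N = 50             # sample size
trials = 200_000   # number of independent samples of size N

samples = rng.exponential(scale=1.0, size=(trials, N))
sample_means = samples.mean(axis=1)

# Empirical variance of the sample mean vs. the theoretical m_c2 / N.
empirical_var = sample_means.var()
theoretical_var = 1.0 / N

print(empirical_var, theoretical_var)
```

With 200,000 trials the empirical variance of $\overline{x}$ should agree with $m_{c_2}/N = 0.02$ to about three decimal places.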


Another answer

$$ \mathbb{E}\left[\frac{1}{N^2}\sum_{i = 1}^N\sum_{j \neq i}x_ix_j\right] = \frac{1}{N^2}\sum_{i = 1}^N\sum_{j \neq i}\mathbb{E}[x_i]\,\mathbb{E}[x_j] $$

And this is not equal to zero: it equals $\frac{N(N-1)}{N^2}m_1^2=\frac{N-1}{N}m_1^2$.
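The value of the cross-term sum can also be verified by simulation. A short sketch, again assuming an illustrative Exponential(1) distribution (so $m_1=1$ and the expected value is $\frac{N-1}{N}m_1^2$); the identity $\sum_{i\neq j}x_ix_j=\left(\sum_i x_i\right)^2-\sum_i x_i^2$ is used to compute the double sum efficiently:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup for illustration: xi ~ Exponential(1), so m1 = 1.
N = 10
trials = 100_000

x = rng.exponential(scale=1.0, size=(trials, N))

# (1/N^2) * sum_{i != j} x_i x_j for each trial, via
# sum_{i != j} x_i x_j = (sum_i x_i)^2 - sum_i x_i^2.
row_sum = x.sum(axis=1)
cross = (row_sum**2 - (x**2).sum(axis=1)) / N**2

empirical = cross.mean()
theoretical = (N - 1) / N * 1.0**2   # (N^2 - N)/N^2 * m1^2

print(empirical, theoretical)
```

The empirical mean of the cross terms comes out near $\frac{N-1}{N}m_1^2 = 0.9$, clearly nonzero, which is exactly the term dropped in the question's derivation.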