Let $X_1,\dots,X_n$ be independent and identically distributed (i.i.d.) random variables with mean $\mu$ and variance $\sigma^2$.
We have to show that the variance of the arithmetic mean, viewed as an estimator of $\mu$, is $$\mathbb{V}(\bar{X})=\frac{\sigma^2}{n}.$$ Here is what I have done.
I know that I could say
$$\mathbb{V}(\bar{X})=\mathbb{V}\Big(\frac{1}{n}\sum_{i=1}^{n}X_i\Big)=\frac{1}{n^2}\mathbb{V}\Big(\sum_{i=1}^{n} X_i\Big)\overset{\text{i.i.d.}}{=}\frac{1}{n^2}\sum_{i=1}^{n} \mathbb{V}(X_i)=\frac{\sigma^2}{n},$$
but I would like to obtain this result from the definition of variance, $\mathbb{V}(\bar{X})=\mathbb{E}(\bar{X}^2)-\mathbb{E}(\bar{X})^2$. We already know that $\mathbb{E}(\bar{X})=\mu$, so what remains is to find $\mathbb{E}(\bar{X}^2)$. Now,
$$\mathbb{E}(\bar{X}^2)=\mathbb{E}\Big(\Big(\frac{1}{n}\sum_{i=1}^{n}X_i\Big)^2\Big)=\frac{1}{n^2}\mathbb{E}\Big(\sum_{i=1}^{n}\sum_{j=1}^{n}X_iX_j\Big)=\frac{1}{n^2}\mathbb{E}\Big(\sum_{i=1}^{n}X_i\sum_{j=1}^{n}X_j\Big)\overset{\text{i.i.d.}}{=}\frac{1}{n^2}\mathbb{E}\Big(\sum_{i=1}^{n}X_i\Big)\mathbb{E}\Big(\sum_{j=1}^{n}X_j\Big)=\frac{1}{n^2}\sum_i\mathbb{E}(X_i)\sum_j\mathbb{E}(X_j)=\frac{n^2\mu^2}{n^2}=\mu^2.$$
If this were correct, it would give $\mathbb{V}(\bar{X})=\mu^2-\mu^2=0$, which I know is not the case. Can anyone tell me where my error is? I suspect it is in the equality marked "i.i.d.", but I don't know how to justify why it fails, because if $X$ and $Y$ are two random variables, then
$$\mathbb{E}(XY)=\mathbb{E}(X)\mathbb{E}(Y)+\text{Cov}(X,Y),$$
so if $X$ and $Y$ are independent and identically distributed, then $\mathbb{E}(n\bar{X}\,n\bar{Y})=n^2\,\mathbb{E}(\bar{X})\,\mathbb{E}(\bar{Y})$, right? I'm pretty depressed right now.
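For what it's worth, a quick simulation (a sketch assuming NumPy; the distribution, parameters, and seed are my own arbitrary choices) shows the claimed value $\sigma^2/n$, so the error must be in the derivation rather than the statement:

```python
import numpy as np

# Sanity check: for i.i.d. samples, the variance of the sample mean
# should be sigma**2 / n (here 4 / 10 = 0.4).
rng = np.random.default_rng(0)
mu, sigma, n, trials = 3.0, 2.0, 10, 200_000

# Draw `trials` independent samples of size n and take each sample's mean.
samples = rng.normal(mu, sigma, size=(trials, n))
means = samples.mean(axis=1)

print(means.var())   # ≈ sigma**2 / n = 0.4
print(means.mean())  # ≈ mu = 3.0
```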
The step $\mathbb{E}\Big(\sum_{i=1}^{n}X_i\sum_{j=1}^{n}X_j\Big)\overset{\text{i.i.d.}}{=}\mathbb{E}\Big(\sum_{i=1}^{n}X_i\Big)\mathbb{E}\Big(\sum_{j=1}^{n}X_j\Big)$ is wrong: $\sum_{i=1}^{n}X_i$ and $\sum_{j=1}^{n}X_j$ are the *same* random variable, not two independent ones, so the expectation of their product does not factor. Independence of the $X_i$ lets you factor $\mathbb{E}(X_iX_j)=\mathbb{E}(X_i)\mathbb{E}(X_j)$ only when $i\neq j$; the diagonal terms $\mathbb{E}(X_i^2)$ remain.
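Splitting the double sum into diagonal ($i=j$) and off-diagonal ($i\neq j$) terms, the computation goes through:
$$\mathbb{E}\Big(\sum_{i=1}^{n}\sum_{j=1}^{n}X_iX_j\Big)=\sum_{i=1}^{n}\mathbb{E}(X_i^2)+\sum_{i\neq j}\mathbb{E}(X_i)\mathbb{E}(X_j)=n(\sigma^2+\mu^2)+n(n-1)\mu^2=n\sigma^2+n^2\mu^2,$$
using $\mathbb{E}(X_i^2)=\sigma^2+\mu^2$. Hence
$$\mathbb{E}(\bar{X}^2)=\frac{n\sigma^2+n^2\mu^2}{n^2}=\frac{\sigma^2}{n}+\mu^2\qquad\Longrightarrow\qquad\mathbb{V}(\bar{X})=\mathbb{E}(\bar{X}^2)-\mu^2=\frac{\sigma^2}{n}.$$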