I have trouble solving the following problem:
Let $$X_1,\dots,X_n$$ be independent, identically distributed random variables with Riemann density $$f$$ on a probability space $$(\Omega,\mathcal{A},P).$$
Moreover, denote $$\sigma^2=\operatorname{Var}(X_1)$$ and assume $$\int_{\Bbb R}x^2 f(x)\,dx<\infty.$$
We have to show that:
$$E\left[\frac{1}{n-1}\sum_{i=1}^n\left(X_i-\frac1n\sum_{j=1}^nX_j\right)^2\right]=\sigma^2.$$
I would really appreciate any help and I hope everything is correctly written.
Greetings
First assume that $E[X_i]=0$, then expand out the square. You'll need to be able to calculate $E[X_i X_j]$ for $i=j$ as well as $i \neq j$. To handle the case $E[X_i]=\mu \neq 0$, consider
$$X_i - \frac{1}{n} \sum_{j=1}^n X_j = (X_i - \mu) - \frac{1}{n} \sum_{j=1}^n (X_j-\mu)$$
which reduces the problem to the previous case.
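For completeness, here is a sketch of the zero-mean calculation the hint describes, writing $\bar X = \frac1n\sum_{j=1}^n X_j$ (notation introduced here for brevity). Assuming $E[X_i]=0$, independence gives $E[X_iX_j]=0$ for $i\neq j$ and $E[X_i^2]=\sigma^2$, so

$$E[X_i\bar X]=\frac1n\sum_{j=1}^n E[X_iX_j]=\frac{\sigma^2}{n},\qquad E[\bar X^2]=\frac{1}{n^2}\sum_{j=1}^n\sum_{k=1}^n E[X_jX_k]=\frac{\sigma^2}{n},$$

and hence

$$E\bigl[(X_i-\bar X)^2\bigr]=E[X_i^2]-2E[X_i\bar X]+E[\bar X^2]=\sigma^2-\frac{2\sigma^2}{n}+\frac{\sigma^2}{n}=\frac{n-1}{n}\,\sigma^2.$$

Summing over $i=1,\dots,n$ gives $(n-1)\sigma^2$, and dividing by $n-1$ yields $\sigma^2$, as required.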