Let $X_1,\ldots,X_n$ be iid random variables with $\mathbb{E}(X_i)=\mu$ and $\operatorname{Var}(X_i)=\sigma^2<\infty$.
Let $\bar{X}_n=\frac{1}{n}\sum_{i=1}^nX_i$ be the sample mean and define $$S_n^2=\frac{1}{n-1}\sum_{i=1}^n(X_i-\bar{X}_n)^2.$$
How can one prove that $\mathbb{E}(S_n^2)=\sigma^2$?
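Before the algebra, a quick Monte Carlo sanity check (a plain-Python sketch; the normal distribution and the parameter choices $\mu=2$, $\sigma=3$, $n=5$ are purely illustrative) shows that the $\frac{1}{n-1}$ divisor is exactly what makes $S_n^2$ unbiased:

```python
import random
import statistics

random.seed(0)
mu, sigma = 2.0, 3.0    # hypothetical true mean and standard deviation
n, trials = 5, 200_000  # small n makes the 1/(n-1) correction visible

unbiased = 0.0  # running sum of S_n^2 values (divisor n-1)
biased = 0.0    # running sum of the divisor-n version, for comparison
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    unbiased += statistics.variance(sample)   # divides by n-1
    biased += statistics.pvariance(sample)    # divides by n
unbiased /= trials
biased /= trials

print(unbiased)  # close to sigma^2 = 9
print(biased)    # close to (n-1)/n * sigma^2 = 7.2
```

With the divisor $n$ instead of $n-1$, the average concentrates around $\frac{n-1}{n}\sigma^2$ rather than $\sigma^2$ — which is exactly the bias the identity $\mathbb{E}(S_n^2)=\sigma^2$ says the $n-1$ divisor removes.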
We have
$E(S^2)=E\left[\frac{1}{n-1}\sum_{i=1}^n (X_i-\overline X )^2\right]$
$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\overline X)^2 \right] \quad $
Subtract and add $\mu$ inside each summand; this is a common trick:
$=\frac{1}{n-1}E\left[\sum_{i=1}^n \left[(X_i-\mu)-(\overline X-\mu) \right]^2 \right] \quad$
Multiplying out the square:
$=\frac{1}{n-1}E\left[\sum_{i=1}^n \left[(X_i-\mu)^2-2(\overline X-\mu)(X_i-\mu)+(\overline X-\mu)^2 \right]\right] \quad$
Splitting the sum into three separate sums (the factor $(\overline X-\mu)$ does not depend on $i$, so it can be pulled out of the middle one):
$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\mu)^2-2(\overline X-\mu)\sum_{i=1}^n(X_i-\mu)+\sum_{i=1}^n(\overline X-\mu)^2 \right] \quad$
Since $(\overline X-\mu)^2$ does not depend on $i$, the last sum equals $n(\overline X-\mu)^2$:
$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\mu)^2-2(\overline X-\mu)\color{blue}{\sum_{i=1}^n(X_i-\mu)}+n(\overline X-\mu)^2 \right] \quad$
Now transform the blue term:
$\sum_{i=1}^n(X_i-\mu)=n\cdot \overline X-n\cdot \mu=n(\overline X-\mu)$
Thus $2(\overline X-\mu)\color{blue}{\sum_{i=1}^n(X_i-\mu)}=2(\overline X-\mu)\cdot (n\cdot \overline X-n\cdot \mu)=2n( \overline X- \mu)^2$
$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\mu)^2-2n( \overline X- \mu)^2+n(\overline X-\mu)^2 \right] \quad$
$=\frac{1}{n-1}E\left[\sum_{i=1}^n (X_i-\mu)^2-n( \overline X- \mu)^2\right] \quad$
$=\frac{1}{n-1}\left[\sum_{i=1}^n E\left[(X_i-\mu)^2\right]-nE\left[( \overline X- \mu)^2\right]\right] \quad$
It remains to determine what $E\left[(X_i-\mu)^2\right]$ and $E\left[( \overline X- \mu)^2\right]$ are. Are all the steps up to here comprehensible?
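For completeness, both remaining expectations follow directly from the assumptions: $E\left[(X_i-\mu)^2\right]=\sigma^2$ is the definition of the variance, and since the $X_i$ are iid, $E\left[(\overline X-\mu)^2\right]=\operatorname{Var}(\overline X)=\sigma^2/n$. Substituting:

```latex
\begin{align*}
E(S^2)
&= \frac{1}{n-1}\left[\sum_{i=1}^n \sigma^2 - n\cdot\frac{\sigma^2}{n}\right]
 = \frac{1}{n-1}\left[n\sigma^2 - \sigma^2\right]
 = \frac{(n-1)\sigma^2}{n-1}
 = \sigma^2.
\end{align*}
```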