Let $(X^{i})_{i=1,...,N}$ be iid random variables.
Why does the empirical standard deviation $\hat{\sigma}$ satisfy $E\hat{\sigma} = E\lvert X^{i}-E[X^{i}]\rvert$?
The empirical standard deviation is defined as $\hat{\sigma}:=\sqrt{\frac{1}{N-1}\sum\limits_{i=1}^{N}(X^{i}-\overline{X})^{2}}$, where $\overline{X}$ is the empirical mean.
I have, of course, seen the proof that $E[\hat{\sigma}^{2}]=\sigma^{2}$. Nonetheless, I am having difficulty proving the claim above.
All I can get:
$\sqrt{\frac{1}{N-1}\sum\limits_{i=1}^{N}(X^{i}-\overline{X})^{2}}=\sqrt{\frac{1}{N-1}\sum\limits_{i=1}^{N}(X^{i})^{2}-\frac{N}{N-1}\overline{X}^{2}}$
Is there any way to proceed from here? In particular, the combination of absolute value signs and an expectation confuses me.
Not true.
Simple example: let $P(X_i=1)=P(X_i=-1)=\frac{1}{4}$ and $P(X_i=0)=\frac{1}{2}$. Then $E(X_i)=0$ and $\sigma^2(X_i)=E(X_i^2)=\frac{1}{2}$, so $\sigma=\frac{1}{\sqrt{2}}$. However, the right-hand side of your formula gives $E\lvert X_i-E(X_i)\rvert=E(\lvert X_i\rvert)=\frac{1}{2}$. Since $\hat{\sigma}\to\sigma$ almost surely as $N\to\infty$ (and $E\hat{\sigma}\to\sigma$), $E\hat{\sigma}$ cannot equal $\frac{1}{2}$ for large $N$, so the claimed identity fails.
Note that $X_i^2=\lvert X_i\rvert$ in this example, since $X_i$ only takes the values $-1$, $0$, and $1$.
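As a numerical sanity check of this counterexample, here is a short Python/NumPy sketch (the variable names and the sample sizes `N` and `trials` are my own arbitrary choices): it computes $\sigma$ and $E\lvert X_i\rvert$ exactly for the three-point distribution, then estimates $E\hat{\sigma}$ by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed, arbitrary choice

# Counterexample distribution: P(X=1) = P(X=-1) = 1/4, P(X=0) = 1/2
values = np.array([-1, 0, 1])
probs = np.array([0.25, 0.5, 0.25])

# True standard deviation: sigma = sqrt(E[X^2]) = 1/sqrt(2) ~ 0.7071
sigma = np.sqrt(np.sum(probs * values**2))

# Mean absolute deviation: E|X - E[X]| = E|X| = 1/2
mean_abs_dev = np.sum(probs * np.abs(values))

print(sigma, mean_abs_dev)  # ~0.7071 vs 0.5 -- not equal

# Monte Carlo estimate of E[sigma_hat] with samples of size N
N, trials = 1000, 2000
samples = rng.choice(values, size=(trials, N), p=probs)
sigma_hat_mean = samples.std(axis=1, ddof=1).mean()  # ddof=1 gives the N-1 denominator

print(sigma_hat_mean)  # close to sigma, not to 0.5
```

The estimate of $E\hat{\sigma}$ lands near $1/\sqrt{2}\approx 0.707$ rather than $0.5$, consistent with the counterexample (for finite $N$, $E\hat{\sigma}$ is in fact slightly below $\sigma$ by Jensen's inequality, but the gap is tiny at $N=1000$).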