Let $\newcommand\top{\overset p\to}\newcommand\isd{\overset d=}\newcommand\P{\mathcal P}\DeclareMathOperator\var{Var}$$X\isd X_1\isd X_2\isd\ldots\isd X_n$ be independent, identically distributed random variables for which $E(X)$ and the central moments $\mu_k'=E((X-E(X))^k)$ exist. An exercise in our course notes asks to prove that
the central sample moments $m_k'=\frac 1n((X_1-\bar X_n)^k+\cdots+(X_n-\bar X_n)^k)$ are consistent and asymptotically unbiased estimators for the $\mu_k'$.
($\bar X_n$ denotes the sample mean $\frac1n(X_1+\cdots+X_n)$.)
Recall "$T_n$ is asymptotically unbiased for $t$" means $E(T_n)\to t$, and "$T_n$ is consistent for $t$" means $\P(|T_n-t|>\varepsilon)\to0$ for all $\varepsilon>0$ (that is, $T_n$ converges in probability to $t$, denoted $T_n\top t$).
I was able to do the consistency part, based on the Weak Law of Large Numbers (WLLN) which says $\bar X_n\top E(X)$ for i.i.d. $X_l$'s:
$$\frac1n((X_1-\bar X_n)^k+\cdots+(X_n-\bar X_n)^k)=\sum_{j=0}^k\binom kj(-\bar X_n)^{k-j}\frac1n\sum_{l=1}^nX_l^j.$$
By WLLN, each $\frac1n\sum_{l=1}^nX_l^j\top E(X^j)$ and $\bar X_n\top E(X)$, and hence the whole thing $$\top\sum_{j=0}^k\binom kj (-E(X))^{k-j}E(X^j)=E((X-E(X))^k).$$
(Recall $A_n\top A$ and $B_n\top B$ implies $A_n+B_n\top A+B$ and $A_nB_n\top AB$.)
Proving the asymptotic unbiasedness seems much harder: $$E\left(\frac {(X_1-\bar X_n)^k+\cdots+(X_n-\bar X_n)^k}n\right)\overset?\to E((X-E(X))^k).$$
I have some ideas but can't make a proof out of them:
- I know that $E(T_n)\to t$ implies $T_n\top t$ provided $\var(T_n)\to0$ (by Chebyshev's inequality). I don't know whether the converse is true; at least I can't invert the proof of that implication.
- We have $E(\frac1n(X_1^k+\cdots+X_n^k))=\frac1n\cdot n\cdot E(X^k)=E(X^k)$. We can't do the same here, however, because each $k$th power $(X_l-\bar X_n)^k$ involves all the other $X_j$'s through $\bar X_n$.
- Perhaps we can use the same binomial-expansion strategy as above, but I don't think there is a general rule saying that $E(A_n)\to a$ and $E(B_n)\to b$ imply $E(A_nB_n)\to ab$.
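(As a numerical sanity check, not a proof: the sketch below, assuming NumPy is available, averages $m_3'$ over many simulated samples from an $\mathrm{Exp}(1)$ distribution, whose third central moment is $\mu_3'=2$, and shows the bias shrinking as $n$ grows.)

```python
import numpy as np

# Monte Carlo estimate of E(m_3') for samples from Exp(1), where mu_3' = 2.
rng = np.random.default_rng(0)

def mean_m3(n, reps=5000):
    # Draw `reps` independent samples of size n and average the third
    # central sample moment m_3' = (1/n) * sum((X_i - Xbar)^3) over them.
    x = rng.exponential(scale=1.0, size=(reps, n))
    m3 = ((x - x.mean(axis=1, keepdims=True)) ** 3).mean(axis=1)
    return m3.mean()

# E(m_3') moves toward mu_3' = 2 as the sample size n increases.
for n in (10, 100, 1000):
    print(n, mean_m3(n))
```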
Since at this point in the notes the concept of convergence in distribution isn't introduced yet, I don't expect to need it.
The Strong Law of Large Numbers (for i.i.d. $X_l$: $\bar X_n\to E(X)$ almost surely, provided $E(X)$ exists) has been mentioned without proof, so I suppose it can be used.
I couldn't find a proof on the internet. How can I prove it?
Newton's binomial expansion is a good idea. First, since the $X_i$ are i.i.d., the terms $(X_i-\overline{X_n})^k$ all have the same expectation, so $\mathbb E(m'_k)=\mathbb E\big[(X_n-\overline{X_n})^k\big]$. Writing $$X_n-\overline{X_n}=\left(1-\frac 1n\right)X_n-\frac 1n\sum_{j=1}^{n-1}X_j,$$ we get $$(X_n-\overline{X_n})^k=\sum_{l=0}^k\binom kl\left(1-\frac 1n\right)^{k-l}X_n^{k-l}\left(-\frac 1n\sum_{j=1}^{n-1}X_j\right)^l.$$ Now take expectations: since $X_n$ is independent of $X_1,\dots,X_{n-1}$, each term factors as $$\binom kl\left(1-\frac 1n\right)^{k-l}\mathbb E\big[X_n^{k-l}\big]\,\mathbb E\left[\left(-\frac 1n\sum_{j=1}^{n-1}X_j\right)^l\right].$$ Using $\left(1-\frac 1n\right)^{k-l}\to1$ and the fact that $$\lim_{n\to \infty}\mathbb E\left[\left(-\frac 1n\sum_{j=1}^{n-1}X_j\right)^l\right]=(-1) ^{l} (\mathbb E[X_1])^l,$$ we conclude $$\mathbb E(m'_k)\to\sum_{l=0}^k\binom kl\,\mathbb E[X^{k-l}]\,(-\mathbb E[X])^l=\mathbb E\big[(X-\mathbb E[X])^k\big]=\mu'_k.$$
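The final limit deserves a word of justification; here is a sketch, assuming (as the hypotheses guarantee) that the raw moments $\mathbb E[X^m]$ exist for $m\le k$. Expanding the $l$th power multinomially,
$$\mathbb E\left[\left(\frac 1n\sum_{j=1}^{n-1}X_j\right)^{l}\right]=\frac 1{n^{l}}\sum_{j_1,\dots,j_l=1}^{n-1}\mathbb E[X_{j_1}\cdots X_{j_l}].$$
The $(n-1)(n-2)\cdots(n-l)$ index tuples with all indices distinct each contribute $(\mathbb E[X_1])^l$ by independence, and $(n-1)(n-2)\cdots(n-l)/n^l\to1$. The remaining $O(n^{l-1})$ tuples have at least one repeated index; by Hölder's inequality each such expectation is bounded by $\mathbb E[|X|^l]$, so their total contribution is $O(1/n)\to0$.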