Convergence of sample central moments
Let $\{X_{n}\}_{n\geq 1}$ be a sequence of i.i.d. random variables with $\mathbb{E}(X_{i}) = \mu$ and $\mathbb{E}(\lvert X_{i} \rvert^{k}) < \infty$. Define $\mu_{k} = \mathbb{E}((X_{i}-\mu)^{k})$ and $M_{k} = \frac{1}{n}\sum_{i=1}^{n}(X_{i}-\bar{X})^{k}$. Show that $M_{k} \xrightarrow{P} \mu_{k}$.
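As a concrete illustration of the two quantities in the statement (not part of any proof), here is a small sketch using an Exponential(1) distribution, for which $\mu = 1$ and the third central moment is known in closed form, $\mu_3 = 2$:

```python
import numpy as np

# Illustration with Exponential(1): mu = 1 and mu_3 = E[(X - 1)^3] = 2.
rng = np.random.default_rng(0)
n, k = 100_000, 3
x = rng.exponential(scale=1.0, size=n)

# M_k = (1/n) * sum_i (X_i - Xbar)^k, the k-th sample central moment
m_k = np.mean((x - x.mean()) ** k)

print(m_k)  # close to mu_3 = 2 for large n
```

With $n = 10^5$ the printed value lands near $2$, which is the behavior the problem asks us to prove in general.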
I found the following question about asymptotic unbiasedness that partially helped with my doubt; however, my goal is different.
Central sample moments are asymptotically unbiased
My idea is to use the binomial theorem, as in that answer, to show that each term of the expansion of $(X_{i}-\bar{X})^{k}$ converges to a certain value, so that the sum divided by $n$ converges by the weak law of large numbers. I'd appreciate some insights. Thank you very much.
Since the linked question shows that $\mathbb E\left[M_{n,k}\right]\to\mu_k$, it suffices to prove that $M_{n,k}-\mathbb E\left[M_{n,k}\right]\to 0$ in probability. Using the binomial formula,
$$ M_{n,k}-\mathbb E\left[M_{n,k}\right]=\sum_{j=0}^k\binom kj(-1)^{k-j}\frac 1n\sum_{i=1}^n\left(X_i^j\overline{X}^{\,k-j}-\mathbb E\left[X_i^j\overline{X}^{\,k-j}\right]\right), $$
hence it suffices to show that for all $j\in\left\{0,\dots,k\right\}$,
$$ \frac 1n\sum_{i=1}^n\left(X_i^j \overline{X}^{\,k-j}-\mathbb E\left[X_i^j \overline{X}^{\,k-j}\right]\right)\to 0\mbox{ in probability.} $$
Observe that by the weak law of large numbers, $\overline{X}\to \mu$ in probability, and since
$$ \frac 1n\sum_{i=1}^n X_i^j\left(\overline{X}^{\,k-j}-\mu^{k-j}\right)=\left(\overline{X}^{\,k-j}-\mu^{k-j}\right)\cdot\frac 1n\sum_{i=1}^n X_i^j\to 0\mbox{ in probability} $$
(by the continuous mapping theorem together with the weak law applied to $\left(X_i^j\right)_{i\geqslant 1}$, which is integrable because $j\leqslant k$), we are reduced to showing that
$$ \frac 1n\sum_{i=1}^n\left(X_i^j \mu^{k-j}-\mathbb E\left[X_i^j \overline{X}^{\,k-j}\right]\right)\to 0\mbox{ in probability.} $$
Applying the weak law of large numbers to the i.i.d. sequence $\left(X_i^j\right)_{i\geqslant 1}$ once more, we are reduced to proving that
$$ \frac 1n\sum_{i=1}^n \mathbb E\left[X_i^j \overline{X}^{\,k-j}\right]\to\mathbb E\left[X_1^j\right]\mu^{k-j}. $$
Write $Y_{i,n}:=n^{-1}\sum_{l=1,l\neq i}^nX_l$. Since $X_i$ is independent of $Y_{i,n}$ and $\mathbb E\left[Y_{i,n}^{k-j}\right]\to\mu^{k-j}$, it remains to show that
$$ \frac 1n\sum_{i=1}^n \mathbb E\left[X_i^j \left(\overline{X}^{\,k-j}-Y_{i,n}^{k-j}\right)\right]\to 0. $$
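The conclusion can be checked numerically. In the sketch below (an illustration only, with the distribution and tolerance $\varepsilon$ chosen arbitrarily), samples are drawn from Exponential(1), for which $\mu_3 = 2$; convergence in probability predicts that the empirical frequency of the event $\lvert M_{n,3} - \mu_3\rvert > \varepsilon$ shrinks as $n$ grows:

```python
import numpy as np

# Numerical check of M_{n,3} -> mu_3 in probability for Exponential(1),
# where mu_3 = 2. For each n, estimate P(|M_{n,3} - mu_3| > eps) by
# Monte Carlo over independent samples of size n.
rng = np.random.default_rng(42)
mu_3, k, eps, trials = 2.0, 3, 0.5, 400

def exceed_freq(n):
    x = rng.exponential(size=(trials, n))
    # sample third central moments, one per trial (row)
    m_k = np.mean((x - x.mean(axis=1, keepdims=True)) ** k, axis=1)
    return np.mean(np.abs(m_k - mu_3) > eps)

freqs = [exceed_freq(n) for n in (100, 1_000, 10_000)]
print(freqs)  # frequencies decrease toward 0 as n grows
```

The frequencies decay toward zero with $n$, as the proposition asserts; with only convergence in probability (not a.s.) this is exactly the kind of statement one can observe empirically.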