I am learning about Moments in the context of Probability
For a random variable with probability density function $f$, the theoretical $k$-th moment is given by: $$\int_{\mathbb R} x^k \cdot f(x)\, \mathrm{d}x$$
Given some measurements $x_1, x_2, \ldots, x_n$, and regardless of our assumptions about their probability distribution, the $k$-th sample moment is given by: $$\frac{1}{n} \sum_{i=1}^n x_i^k$$
My Question: I have often heard people (informally) say that as the number of measurements grows, the sample moment converges to the theoretical moment, but I have never seen a mathematical proof of this.
Is it possible to prove that :
$$ \frac{1}{n} \sum_{i=1}^n x_i^k \xrightarrow{n\to\infty} \int_{\mathbb R} x^k \cdot f(x)\, \mathrm{d}x \;? $$
Thanks!
As hinted by geetha290krm in the comments, if the $X_i$ are iid, then the $X_i^k$ are also iid.
The (weak) LLN states:
$$\frac{1}{n} \sum_{i=1}^n Y_i \xrightarrow{p} E[Y_1] \text{ as } n\to\infty,$$ provided the $Y_i$ are iid with finite mean $E[|Y_1|] < \infty$.
Here we set $Y_i = X_i^k$. Assuming the $k$-th moment exists, i.e. $E[|X_1|^k] < \infty$, the LLN above gives $$\frac{1}{n} \sum_{i=1}^n X_i^k \xrightarrow{p} E[X_1^k] = \int_{\mathbb R} x^k \cdot f(x)\, \mathrm{d}x,$$ which is exactly the claimed convergence of the sample moment to the theoretical moment.
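You can also see this convergence empirically. Below is a minimal simulation sketch (my own illustration, not part of the proof) using NumPy: for a standard normal, $E[X^2] = 1$ and $E[X^4] = 3$, and the sample moments should approach these values as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_moment(x, k):
    """k-th sample moment: (1/n) * sum of x_i^k."""
    return np.mean(x ** k)

# Draw increasingly large iid samples from N(0, 1) and compare
# sample moments against the theoretical values E[X^2]=1, E[X^4]=3.
for n in [100, 10_000, 1_000_000]:
    x = rng.standard_normal(n)
    print(f"n={n:>9}: m2={sample_moment(x, 2):.4f}, m4={sample_moment(x, 4):.4f}")
```

The printed second and fourth sample moments wander toward 1 and 3 respectively as $n$ increases, consistent with the weak LLN applied to $Y_i = X_i^k$.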