Assume i.i.d. random variables $X_i$ with $E(X_i)=0$, $E(X_i^2)=1$, and $E(|X_i|^m)<\infty$ for all $m$. The central limit theorem states that
$$ Y_n :=\frac{1}{\sqrt{n}}\sum_{i=1}^n X_i \stackrel{d}{\to} Z $$ where $Z\sim N(0,1)$. Let $p>0$. The continuous mapping theorem then implies $$|Y_n|^p \stackrel{d}{\to} |Z|^p.$$
I need to show that, for any $p>0$, as $n\to \infty$, $$E(|Y_n|^p) \to E(|Z|^p).$$
I know that we need to show uniform integrability, which is implied if the following condition holds for some $\epsilon>0$: $$\sup_{n}E(|Y_n|^{p+\epsilon})< \infty.$$
Does this hold for any $p>0$ in the given i.i.d. case? Note that it is sufficient to check the condition for even values of $p+\epsilon$ (eliminating the absolute value).
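Before looking at the moments directly, the claimed convergence $E(|Y_n|^p) \to E(|Z|^p)$ can be sanity-checked by simulation. The sketch below is purely illustrative: the choice of Rademacher variables ($P(X_i=\pm 1)=1/2$, so $E X_i = 0$, $E X_i^2 = 1$, and all moments finite), and the values $p=3$, $n=400$, and the sample size are all my own assumptions for the experiment.

```python
import numpy as np

# Illustrative simulation (not part of the proof): estimate E|Y_n|^p for
# Rademacher X_i and compare with E|Z|^p for p = 3, where
# E|Z|^3 = 2*sqrt(2/pi). A sum of n Rademacher variables has the same
# distribution as 2*Binomial(n, 1/2) - n, which avoids storing a large
# reps-by-n matrix of individual draws.
rng = np.random.default_rng(0)
n, reps, p = 400, 100_000, 3

S = 2.0 * rng.binomial(n, 0.5, size=reps) - n  # S = X_1 + ... + X_n
Y = S / np.sqrt(n)                             # Y_n = n^{-1/2} * S
est = (np.abs(Y) ** p).mean()                  # Monte Carlo estimate of E|Y_n|^p

exact = 2.0 * np.sqrt(2.0 / np.pi)             # E|Z|^3 for Z ~ N(0,1)
print(est, exact)                              # the two should be close
```

With these sample sizes the Monte Carlo error is of order $10^{-2}$, so the estimate should land close to $2\sqrt{2/\pi}\approx 1.596$.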
For $p<2$ the statement is correct. We have \begin{align*} E(|Y_n|^2)& = E\Big(\frac{1}{n} \Big(\sum_{i=1}^n X_i\Big)^2\Big)\\ & = \frac{1}{n} E\Big( \sum_i X_i^2 + 2 \sum_{i< j}X_iX_j \Big)\\ & = E(X_1^2) \end{align*} since $E(X_iX_j)= E(X_i)E(X_j)=0$ for $i\neq j$.
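The $p=2$ computation can be verified exactly for small $n$ by brute-force enumeration. The snippet below uses Rademacher variables $X_i = \pm 1$ (my choice for illustration; they satisfy $E X_i = 0$, $E X_i^2 = 1$) and averages $(\sum_i X_i)^2/n$ over all $2^n$ equally likely sign vectors.

```python
from itertools import product
from fractions import Fraction

# Exact brute-force check of E(Y_n^2) = E(X_1^2) for small n, using
# Rademacher variables X_i = ±1. Purely illustrative, not part of the proof.
def second_moment(n):
    total = Fraction(0)
    for signs in product((-1, 1), repeat=n):   # all 2^n equally likely outcomes
        total += Fraction(sum(signs) ** 2, n)  # (sum X_i)^2 / n
    return total / 2 ** n                      # exact average over outcomes

for n in (1, 2, 5, 8):
    print(n, second_moment(n))   # each equals 1 = E(X_1^2)
```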
For $p<4$ it should hold too: $(\sum_{i=1}^n X_i)^4$ contains $n$ summands of the form $X_i^4$ and on the order of $n(n-1)$ summands of the form $X_i^2X_j^2$; every other summand contains some $X_j$ to the first power, so its expectation vanishes again. It would then follow that $$E(|Y_n|^4) \leq C \frac{n(n-1)}{n^2} \leq C,$$ uniformly in $n$.
I tried a similar argument for $p=10$, and if I did not make any mistakes in checking the cases, the largest number of terms entering the expectation of $(\sum_{i=1}^n X_i)^{10}$ should occur when any $5$ of the $n$ variables enter as a product of the form $X_i^2 X_j^2 X_k^2 X_l^2 X_m^2$, for which there are $n(n-1)(n-2)(n-3)(n-4) \leq n^5$ possibilities. Hence: $$E(|Y_n|^{10}) \leq C \frac{n^5}{n^5} = C,$$ uniformly in $n$.
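The $p=10$ counting can also be checked exactly for small $n$, again with Rademacher variables as an illustrative assumption. The code computes $E[Y_n^{10}]$ by enumerating all $2^n$ sign vectors and compares it with the dominant contribution $\binom{n}{5}\cdot\frac{10!}{2^5}\cdot n^{-5}$ coming from the all-squares terms; both quantities tend to $9!! = 945$, the tenth Gaussian moment, as $n$ grows.

```python
from itertools import product
from math import comb, factorial

# Illustrative exact check (Rademacher X_i = ±1) of the counting argument.
def tenth_moment(n):
    # E[Y_n^10] by enumerating all 2^n sign vectors, Y_n = n^{-1/2} * sum X_i
    total = sum(sum(s) ** 10 for s in product((-1, 1), repeat=n))
    return total / 2 ** n / n ** 5

def dominant_term(n):
    # contribution of the terms X_i^2 X_j^2 X_k^2 X_l^2 X_m^2 with
    # 5 distinct indices: C(n,5) index choices, multinomial weight 10!/2^5
    return comb(n, 5) * (factorial(10) // 2 ** 5) / n ** 5

for n in (8, 12, 16):
    print(n, tenth_moment(n), dominant_term(n))
```

The remaining gap between the two columns comes from terms with some exponent larger than $2$ (e.g. $X_i^4 X_j^2 X_k^2 X_l^2$), which involve fewer than five distinct indices and are asymptotically negligible.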
My observations suggest that the claim holds for all $p>0$. And, choosing $p$ as an even number, the largest number of terms entering the sum seems to be given by a factor of $n(n-1)(n-2) \cdots (n-p/2+1)$.
As you note, it is sufficient to look at large even moments of $Y_n$. Fix a positive integer $k$. Then $$E[Y_n^{2k}] = \frac{1}{n^{k}}\sum\limits_{a_1 + \cdots + a_m = 2k}\sum\limits_{1\leq j_1 < j_2 < \cdots < j_m \leq n} \binom{2k}{a_1,a_2,\ldots,a_m} E[X_{j_1}^{a_1}] \cdots E[X_{j_m}^{a_m}],$$
where the first sum is over all compositions of $2k$. Note that if any $a_i = 1$, then the corresponding expectation is zero, so it suffices to restrict to compositions with each $a_i \geq 2$. Moreover, the number of terms in the first sum does not depend on $n$, so it is sufficient to show that $$\frac{1}{n^k}\sum\limits_{1\leq j_1 < j_2 < \cdots < j_m \leq n} \binom{2k}{a_1,a_2,\ldots,a_m} E[X_{j_1}^{a_1}] \cdots E[X_{j_m}^{a_m}]$$ is bounded for any composition $a_1 + \cdots + a_m = 2k$ with each part $\geq 2$. If any $a_i$ is strictly greater than $2$, then there are fewer than $k$ parts in total, and we may bound the sum by $$\frac{n^m}{n^k} \binom{2k}{a_1,\ldots,a_m}M^m,$$ where $M = \sup_{a \leq 2k} E[|X|^a]$. Since $m < k$, this goes to zero as $n \to \infty$. Thus the only remaining term is the one in which all $a_i$'s are equal to $2$, in which case all of the moments $E[X_{j_i}^{2}] = 1$, so the sum becomes $$\frac{1}{n^k}\sum\limits_{1\leq j_1 < j_2 < \cdots < j_k \leq n} \binom{2k}{2,2,\ldots,2} E[X_{j_1}^{2}] \cdots E[X_{j_k}^{2}] = \frac{1}{n^k}\binom{n}{k}\binom{2k}{2,2,\ldots,2} \to \frac{1}{k!}\binom{2k}{2,2,\ldots,2},$$ which is bounded.
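As a quick numerical cross-check of the final limit: $\frac{1}{k!}\binom{2k}{2,\ldots,2} = \frac{(2k)!}{2^k\,k!} = (2k-1)!!$, which is exactly the $2k$-th moment of $N(0,1)$, consistent with $E[Y_n^{2k}] \to E[Z^{2k}]$. The sketch below evaluates the prelimit expression for large $n$ and compares it with $(2k-1)!!$.

```python
from math import comb, factorial

# Check that (1/n^k) * C(n, k) * (2k)!/2^k approaches
# (2k)!/(2^k * k!) = (2k-1)!!, the 2k-th moment of N(0, 1).
def partial(n, k):
    # prelimit value of the all-squares term for finite n
    return comb(n, k) * factorial(2 * k) / (2 ** k * n ** k)

def gaussian_moment(k):
    # E[Z^{2k}] = (2k-1)!! = (2k)! / (2^k * k!)
    return factorial(2 * k) // (2 ** k * factorial(k))

for k in (1, 2, 5):
    print(k, partial(10**6, k), gaussian_moment(k))
```

For $k = 1, 2, 5$ the Gaussian moments are $1$, $3$, and $945$, and the prelimit values at $n = 10^6$ agree to within a relative error of order $k^2/n$.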