I have a fair background in probability but am very inexperienced when it comes to limiting behavior of random variables. This being the case, I request some guidance for this problem:
If $\{X_n\}^\infty_{n=1}$ are i.i.d., then prove that for all fixed $p > 0$,
$E\{|X_1|^p\} < \infty$ if and only if
$X_n = o(n^{1/p})$ a.s.
Based on what I read on Wikipedia, it seems the last line is saying that as $n$ increases, $X_n$ is essentially increasing at log speed (and it seems like $p$ doesn't matter much here). I suspect the last line is a typo, since the $X_n$ are i.i.d. Perhaps it should instead be $\sum_{i=1}^n X_i$ on the left-hand side? Please let me know whether my understanding is correct here, and whether there is a typo.
With that out of the way, I am not sure how to proceed. Based on related examples in the book, I am led to believe that there is some well known inequality which can be applied here. From there, I don't know how to end up with the expression $o(n^{1/p})$. Is there a general course of action that may be taken with problems such as these?
$E|X_1|^{p} <\infty$ if and only if $\sum_n P(|X_1|^{p} > n) <\infty$ (the standard tail-sum criterion for integrability). Fix $\epsilon > 0$ and apply this to $X_1/\epsilon^{1/p}$: since $|X_1/\epsilon^{1/p}|^{p} = |X_1|^{p}/\epsilon$, we get $E|X_1|^{p} <\infty$ if and only if $\sum_n P(|X_1|^{p} > \epsilon n) <\infty$. Because the $X_n$ are identically distributed, this is the same as $\sum_n P(|X_n|^{p} > \epsilon n) <\infty$.

Now use both halves of the Borel–Cantelli lemma. If the sum is finite, the first lemma gives $P(|X_n|^{p} > \epsilon n \text{ i.o.}) = 0$. Conversely, since the events $\{|X_n|^{p} > \epsilon n\}$ are independent, the second lemma shows that a divergent sum would force this probability to be $1$. Hence the sum is finite if and only if $P(|X_n|^{p} > \epsilon n \text{ i.o.}) = 0$.

This holds for every $\epsilon > 0$; intersecting over a countable sequence, say $\epsilon = 1/k$, we conclude that $E|X_1|^{p} <\infty$ if and only if $n^{-1/p}X_n \to 0$ almost surely, i.e. $X_n = o(n^{1/p})$ a.s.
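Not part of the proof, but here is a quick numerical sanity check of the conclusion. Assuming standard normal $X_n$ (so $E|X_1|^p < \infty$ for every $p$) and $p = 2$, the tail suprema $\sup_{k \ge m} |X_k|/k^{1/p}$ should shrink toward $0$ as $m$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 2.0
n = 10**6

# i.i.d. sample with E|X|^p < infinity (the normal has all moments)
x = np.abs(rng.standard_normal(n))
ratio = x / np.arange(1, n + 1) ** (1.0 / p)

# tail_sup[m-1] = sup_{k >= m} |X_k| / k^{1/p}; computed by a running
# maximum from the right, so it is non-increasing in m by construction
tail_sup = np.maximum.accumulate(ratio[::-1])[::-1]

for m in (1, 10**2, 10**4, n):
    print(m, tail_sup[m - 1])
```

Each printed value bounds the whole tail of the sequence $|X_k|/k^{1/p}$, so watching them decay toward $0$ is exactly watching $X_n = o(n^{1/p})$ along this sample path. Repeating the experiment with a distribution where $E|X_1|^p = \infty$ (e.g. standard Cauchy with $p = 2$) shows the tail suprema failing to decay.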