This is an exercise our professor gave us on an exam. I solved it partially, but now I would like to solve it completely.
Show that for a sequence of random variable $\{X_n \}_{n \ge 1}$ and for a real number $r > 2$ $$\forall \epsilon >0, \quad \lim_{n\rightarrow \infty} \frac{1}{s_n^r} \sum_{j=1}^n \mathbb{E}(|X_j|^r 1_{|X_j|>\epsilon s_n})=0$$ if and only if $$\lim_{n\rightarrow \infty} \frac{1}{s_n^r} \sum_{j=1}^n \mathbb{E}(|X_j|^r)=0$$ where $s^2_n = \sum_{j=1}^n \mathbb{E}(X_j^2)$.
Progress so far:
("$\Leftarrow$") This direction is straightforward. We have $$\mathbb{E}(|X_j|^r) \ge \mathbb{E}(|X_j|^r1_{|X_j|>\epsilon s_n}),$$ and so the result follows.
("$\Rightarrow$"). In this direction I am not even sure whether the implication is true. Also, I was wondering whether the assumption $\mathbb{E}(X_j) = 0$ is necessary or not. $$$$
Edit: Forgot a square root at some point, corrected it the way Dave Giraudo suggested.
Let $\epsilon > 0$. We have
$$\frac{1}{s_n^r} \sum_{j=1}^n \mathbb{E}(|X_j|^r)=\frac{1}{s_n^r} \sum_{j=1}^n \left(\mathbb{E}(|X_j|^r 1_{|X_j|>\epsilon s_n}) + \mathbb{E}(|X_j|^r 1_{|X_j|\leq\epsilon s_n})\right).$$ By assumption, the first sum on the right-hand side tends to $0$ after division by $s_n^r$. For the second sum, on the event $\{|X_j| \leq \epsilon s_n\}$ we have $|X_j|^r = |X_j|^{r-2}|X_j|^2 \leq (\epsilon s_n)^{r-2} X_j^2$, so by monotonicity of the expectation,
$$\frac{1}{s_n^r} \sum_{j=1}^n\mathbb{E}(|X_j|^r 1_{|X_j|\leq\epsilon s_n}) \leq \frac{1}{s_n^r} \sum_{j=1}^{n} \epsilon^{r-2}s_{n}^{r-2} \mathbb{E}(X_j^2) = \epsilon^{r-2}\,\frac{s_n^{r-2}\, s_n^2}{s_n^r} = \epsilon^{r-2}.$$
Hence $\limsup_{n\to\infty} \frac{1}{s_n^r} \sum_{j=1}^n \mathbb{E}(|X_j|^r) \leq \epsilon^{r-2}$ for every $\epsilon > 0$. Since $r > 2$, letting $\epsilon \to 0$ gives $\epsilon^{r-2} \to 0$, which concludes the proof.
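As a quick numerical sanity check (not part of the proof), consider a hypothetical Rademacher sequence $X_j = \pm 1$ with equal probability: then $\mathbb{E}(X_j^2) = 1$, so $s_n = \sqrt{n}$, the quantity $\frac{1}{s_n^r}\sum_j \mathbb{E}(|X_j|^r)$ equals $n^{1-r/2} \to 0$, and the truncated sum is exactly $0$ as soon as $\epsilon s_n > 1$. Both ratios can be computed in closed form:

```python
import math

def lyapunov_ratio(n, r):
    """(1/s_n^r) * sum_j E|X_j|^r for Rademacher X_j: equals n / n^(r/2)."""
    s_n = math.sqrt(n)  # s_n^2 = sum_j E(X_j^2) = n
    return n * 1.0 / s_n**r

def truncated_ratio(n, r, eps):
    """(1/s_n^r) * sum_j E(|X_j|^r 1_{|X_j| > eps*s_n}).

    Since |X_j| = 1 surely, the indicator is 1 iff 1 > eps*sqrt(n).
    """
    s_n = math.sqrt(n)
    indicator = 1.0 if 1.0 > eps * s_n else 0.0
    return n * indicator / s_n**r

r, eps = 3.0, 0.5
for n in (1, 10, 100, 1000):
    print(n, lyapunov_ratio(n, r), truncated_ratio(n, r, eps))
```

For $r = 3$ the first ratio is $n^{-1/2}$, and with $\epsilon = 0.5$ the truncated ratio vanishes for all $n \geq 5$, consistent with both limits in the statement.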