How can I verify the convergence of the series of variances (the third condition of the theorem below) in this case?
Let $ \{X_n\}_{n\ge1} $ be a sequence of independent random variables with $ \mathbf{E}X_n = 0 $ for all $ n\ge1 $, and suppose that for some $ p \ge 1 $ \begin{equation} \sum\limits_{n=1}^{\infty} \frac{\mathbf{E}|X_n|^{2p}}{n^{p+1}} < \infty. \end{equation} Prove that $ \frac{S_n}{n} \to 0 $ almost surely as $ n \to \infty $, where $ S_n = X_1 + \dots + X_n $.
I am trying to use the following $ \textbf{Theorem} $ with $ b_n = n $:
Let $ \{X_n\}_{n\ge1} $ be a sequence of independent random variables and suppose that for some $ \varepsilon > 0 $ the following conditions hold: \begin{equation} \sum\limits_{n=1}^{\infty} \mathbf{P}(|X_n| \ge \varepsilon b_n) < \infty, \end{equation} \begin{equation} \frac{1}{b_n}\sum\limits_{j=1}^{n} \mathbf{E}Y_j \to 0 \quad \text{as}\quad n \to \infty , \quad Y_j = X_j \mathbf{I}\{ |X_j| \le \varepsilon b_j \}, \end{equation} \begin{equation} \sum\limits_{n=1}^{\infty} \frac{\mathbf{Var}\,Y_n}{b_n^2} < \infty. \end{equation} Then $ \frac{S_n}{b_n}\to 0 $ a.s.
I proved the first and second conditions from the hypothesis of the exercise, but I cannot prove the third. I tried to estimate the variance by splitting over the events $ \{k < |X_n| \le k+1\} $ (the $ k = 0 $ term is bounded separately, since on it $ X_n^2 \le 1 $, and for $ k \ge 1 $ one has $ X_n^2 \le |X_n|^{2p}/k^{2p-2} $ on $ \{k < |X_n| \le k+1\} $): \begin{equation} \mathbf{Var}\,Y_n \le \mathbf{E}[X_n^2; \; |X_n| \le n] = \sum\limits_{k=0}^{n-1}\mathbf{E}[X_n^2; \; k < |X_n| \le k+1] \le \mathbf{E}[X_n^2; \; |X_n| \le 1] + \sum\limits_{k=1}^{n-1}\frac{1}{k^{2p-2}}\mathbf{E}[|X_n|^{2p}; \; k < |X_n| \le k+1]. \end{equation} But nothing worked.
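For reference, here is how I checked the first condition (the key point is that $ n^{2p} \ge n^{p+1} $ for $ p \ge 1 $, so the hypothesis of the exercise applies directly); by Markov's inequality, \begin{equation} \mathbf{P}(|X_n| \ge \varepsilon n) \le \frac{\mathbf{E}|X_n|^{2p}}{(\varepsilon n)^{2p}} \le \frac{1}{\varepsilon^{2p}} \cdot \frac{\mathbf{E}|X_n|^{2p}}{n^{p+1}}, \end{equation} and summing over $ n $ gives a convergent series. An analogous estimate does not seem to close the gap for the variance series, which is where I am stuck.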