I have recently been stuck on a weak LLN for arbitrary random variables:
For any sequence of r.v.'s $\{X_n\}$, if $E(X_n^2) \rightarrow 0$, then $$ \frac{S_n-E(S_n)}{n}\rightarrow 0 \quad \text{in probability}, $$ where $S_n=\sum_{j=1}^nX_j$.
Without loss of generality, we may assume $E(X_j)=0$ for each $j>0$. By Chebyshev's inequality, to prove the result above it suffices to show $$ E(S_n^2)=\sum_{j=1}^n E(X_j^2)+2\sum_{1\leq j<k\leq n}E(X_jX_k)=o(n^2). $$ Since $E(X_n^2) \rightarrow 0$, the sequence $E(X_n^2)$ is bounded, so $\sum_{j=1}^n E(X_j^2)=O(n)=o(n^2)$. Thus the only remaining difficulty is the cross-term sum $\sum_{1\leq j<k\leq n}E(X_jX_k)$.
How should I proceed?
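My attempt so far (I am not sure every step is tight): Cauchy–Schwarz gives $|E(X_jX_k)|\leq\sqrt{E(X_j^2)}\sqrt{E(X_k^2)}$, so writing $a_j=\sqrt{E(X_j^2)}$,

$$
2\Big|\sum_{1\leq j<k\leq n}E(X_jX_k)\Big|
\leq 2\sum_{1\leq j<k\leq n}a_ja_k
\leq \Big(\sum_{j=1}^n a_j\Big)^2 .
$$

Since $a_n=\sqrt{E(X_n^2)}\rightarrow 0$, the Cesàro means satisfy $\frac{1}{n}\sum_{j=1}^n a_j\rightarrow 0$, which would give $\big(\sum_{j=1}^n a_j\big)^2=o(n^2)$ and hence $E(S_n^2)=o(n^2)$. Is this argument correct, or am I missing a subtlety?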