Let $(X_n)$ be a sequence of identically distributed random variables that are pairwise uncorrelated, with $\mathbb{E}(|X_1|)<\infty$.
Is it then necessarily true that $\frac{X_1+\dots+X_n}{n} \rightarrow \mathbb{E}(X_1)$ a.s.? If not, what would be a counterexample?
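To get a feel for the hypotheses, here is a small simulation of a classical family satisfying them: $X_k = \cos(kU)$ with $U$ uniform on $(0, 2\pi)$ is identically distributed (arcsine law on $[-1,1]$, mean $0$) and pairwise uncorrelated, yet highly dependent. This is an illustration of the hypotheses, not a counterexample; the variable names are mine.

```python
import numpy as np

# X_k = cos(k*U), U ~ Uniform(0, 2*pi): identically distributed (arcsine law
# on [-1, 1], mean 0), pairwise uncorrelated since E[cos(jU)cos(kU)] = 0 for
# j != k -- yet every X_k is a deterministic function of the single draw U.
rng = np.random.default_rng(0)
paths, n = 2_000, 1_000
U = rng.uniform(0.0, 2.0 * np.pi, size=paths)  # one U per sample path
k = np.arange(1, n + 1)
X = np.cos(np.outer(U, k))                     # X[i, j] = cos((j+1) * U[i])
A_n = X.cumsum(axis=1) / k                     # running averages (X_1+...+X_n)/n
# The typical path average is already tiny at n = 1000, consistent with
# (X_1 + ... + X_n)/n -> 0 = E(X_1) almost surely for this family.
print(np.median(np.abs(A_n[:, -1])))
```

Here the averages converge, so this family does not answer the question negatively; it only shows the hypotheses are strictly weaker than independence.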
Proof of the conjecture, assuming $E(X_1^2)<\infty$. To simplify writing, let $Y_k=X_k-E(X_k)$, so the claim becomes $A_n=\frac{Y_1+Y_2+\dots+Y_n}{n}\to 0$ a.s. The first step is to show $\operatorname{Var}(A_n)\to 0$, which, since $E(A_n)=0$, means showing $E(A_n^2)\to 0$. (By itself this only gives convergence in $L^2$, hence in probability; almost-sure convergence then follows by a standard subsequence argument, Chebyshev plus Borel–Cantelli along $n=k^2$.)
$$E(A_n^2)=\frac{E\big[\big(\sum_{k=1}^nY_k\big)^2\big]}{n^2}=\frac{\sum_{k=1}^nE(Y_k^2)+\sum_{j\ne k}E(Y_jY_k)}{n^2}.$$ Let $V=E(Y_1^2)$; since $E(Y_jY_k)=0$ for $j\ne k$ (uncorrelated), this gives $E(A_n^2)=\frac{nV}{n^2}=\frac{V}{n}\to 0$.
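Since $E(A_n^2)\to 0$ by itself only yields convergence in probability, here is a sketch (the standard Rajchman-type argument) of how the bound $\operatorname{Var}(A_n)=V/n$ upgrades to almost-sure convergence; write $S_n=Y_1+\dots+Y_n$:

```latex
% Step 1: along n = k^2, Chebyshev gives a summable tail, so Borel--Cantelli
% yields A_{k^2} -> 0 a.s.:
\[
  P\!\left(|A_{k^2}| > \varepsilon\right) \le \frac{V}{k^2 \varepsilon^2},
  \qquad
  \sum_{k \ge 1} \frac{V}{k^2 \varepsilon^2} < \infty .
\]
% Step 2: control the fluctuation between consecutive squares. For
% k^2 \le n < (k+1)^2 there are at most 2k extra terms, and pairwise
% uncorrelatedness gives E[(S_n - S_{k^2})^2] = (n - k^2)V \le 2kV. With
% D_k = \max_{k^2 \le n < (k+1)^2} |S_n - S_{k^2}|:
\[
  E\!\left[D_k^2\right]
  \le \sum_{n=k^2}^{(k+1)^2-1} E\!\left[(S_n - S_{k^2})^2\right]
  \le (2k+1)\cdot 2kV \le 6k^2 V,
  \qquad
  P\!\left(\tfrac{D_k}{k^2} > \varepsilon\right) \le \frac{6V}{k^2 \varepsilon^2},
\]
% again summable, so D_k / k^2 -> 0 a.s. Finally, for k^2 \le n < (k+1)^2,
% |A_n| \le |S_{k^2}|/k^2 + D_k/k^2 -> 0 a.s.
```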