I was looking at a proof of the strong law of large numbers, and am having trouble finding where the proof uses the assumption that the random variables are identically distributed. I'll reproduce the proof below, but the link can be found here (https://www.math.ucdavis.edu/~tracy/courses/math135A/UsefullCourseMaterial/lawLargeNo.pdf). To start, we have Chebyshev's inequality and the easy side of Borel-Cantelli (both of which I understand).
I'll save time by assuming the independent real-valued random variables $(X_{k})_{k \in \mathbb{N}}$ on the probability space $(\Omega, \mathcal{F}, \mu )$ all have expected value $0$. Further, we'll assume that $E[ X_{k}^{2} ]$ and $E[ X_{k}^{4} ]$ are uniformly bounded above by $C$. Writing $S_{k} = \sum_{j = 1}^{k} X_{j}$, the claim is that $\lim_{k \to \infty} k^{-1} S_{k} ( \omega ) = 0$ for almost all $\omega \in \Omega$.
If $\omega \in \Omega$ is such that $\lim_{k \to \infty} k^{-1} S_{k}( \omega ) \neq 0$ (or the limit fails to exist), then there exists $\epsilon > 0$ such that $|S_{k} ( \omega ) | > k \epsilon $ for infinitely many $k$. So if we can show that $\sum_{k = 1}^{\infty} \Pr ( |S_{k} | > k \epsilon ) < \infty$ for arbitrary $\epsilon > 0$, then Borel-Cantelli will finish the job.
Now, let us consider $E [ S_{k}^{4} ]$. Then
\begin{align*} E \left[ S_{k}^{4} \right] & = E \left[ \sum_{1 \leq j_{1}, \ldots , j_{4} \leq k } \prod_{i = 1}^{4} X_{j_{i}} \right] \\ & = \sum_{ 1 \leq j_{1}, \ldots, j_{4} \leq k } E \left[ \prod_{i = 1}^{4} X_{j_{i}} \right] \end{align*}
Now, we can ignore the terms with four distinct subscripts, as well as those of the form $E \left[ X_{j_{1}}^{3} X_{j_{2}} \right]$ or $E \left[ X_{j_{1}}^{2} X_{j_{2}} X_{j_{3}} \right]$ with distinct subscripts: in each case independence lets the expectation factor out some $E[ X_{j} ] = 0$. That leaves $k$ terms of the form $E \left[ X_{j}^{4} \right]$, and $3k(k - 1)$ terms of the form $E \left[ X_{j_{1}}^{2} X _{j_{2}}^{2} \right]$, where $j_{1} \neq j_{2}$. Thus
\begin{align*} E \left[ S_{k}^{4} \right] & = \left( \sum_{j = 1}^{k} E \left[ X_{j}^{4} \right] \right) + 3 \left( \sum_{1 \leq j_{1} \neq j_{2} \leq k } E \left[ X_{j_{1}}^{2} \right] E \left[ X_{j_{2}}^{2} \right] \right) \\ & \leq kC + 3k(k - 1) C^{2} \\ & \leq D k^{2} \end{align*}
for some $D > 0$; note that independence is what lets us factor $E \left[ X_{j_{1}}^{2} X_{j_{2}}^{2} \right]$ into $E \left[ X_{j_{1}}^{2} \right] E \left[ X_{j_{2}}^{2} \right]$. Now, invoking Chebyshev's inequality for $p = 4$ yields
\begin{align*} \Pr ( | S_{k} | > k \epsilon ) & \leq (k \epsilon )^{-4} D k^{2} \\ & = (D / \epsilon^{4} ) k^{-2} \\ \Rightarrow \sum_{k = 1}^{\infty} \Pr ( | S_{k} | > k \epsilon ) & \leq (D / \epsilon^{4} ) \sum_{k = 1}^{\infty} k^{-2} \\ & < \infty \end{align*}
Thus we can use Borel-Cantelli and call it a day. However, I cannot find which part of this proof uses the assumption that the random variables are identically distributed. If anybody could point it out to me, I'd greatly appreciate it. Also, I apologize if this has typos: I'm writing this on my phone. Thanks.
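(For what it's worth, here's a brute-force sanity check I ran on the term counting in the expansion of $E[S_k^4]$ above; the function name and the choice $k = 5$ are just for illustration. It classifies each $4$-tuple of subscripts by its multiplicity pattern, so $(4,)$ corresponds to the $E[X_j^4]$ terms, $(2, 2)$ to the $E[X_{j_1}^2 X_{j_2}^2]$ terms with $j_1 \neq j_2$, and every other pattern has some subscript of multiplicity $1$ and hence vanishes.)

```python
from collections import Counter
from itertools import product

def term_counts(k):
    """Classify each 4-tuple (j1, j2, j3, j4) in {1..k}^4 by the
    multiplicity pattern of its subscripts, e.g. (2, 2) for terms
    of the form E[X_{j1}^2 X_{j2}^2] with j1 != j2."""
    patterns = Counter()
    for tup in product(range(1, k + 1), repeat=4):
        pattern = tuple(sorted(Counter(tup).values(), reverse=True))
        patterns[pattern] += 1
    return patterns

k = 5
counts = term_counts(k)
print(counts[(4,)])    # -> 5, i.e. k terms of the form E[X_j^4]
print(counts[(2, 2)])  # -> 60, i.e. 3k(k-1) terms E[X_{j1}^2 X_{j2}^2]
# every remaining pattern has a subscript of multiplicity 1,
# so those terms vanish by independence and E[X_j] = 0
```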
It's unnecessary to assume identical distributions: the proof never uses them. All it needs is independence, the common mean $0$, and the uniform bound on the second and fourth moments; the $X_i$ don't even need matching moments, only moments bounded by a common constant.
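A quick simulation is consistent with this. In the sketch below (my own illustration, not from the linked notes), the $X_k$ are independent with mean $0$ and fourth moments bounded by a common constant, but deliberately not identically distributed; the particular mixture of laws and the sample size are arbitrary choices.

```python
import random

random.seed(0)

n = 200_000
s = 0.0
for k in range(1, n + 1):
    # alternate between two different mean-zero laws, so the X_k
    # are independent but NOT identically distributed; both laws
    # have fourth moments bounded by 1
    if k % 2 == 0:
        x = random.uniform(-1.0, 1.0)    # Uniform(-1, 1)
    else:
        x = random.choice([-1.0, 1.0])   # Rademacher (fair +/-1)
    s += x

print(abs(s / n))  # small, consistent with S_n / n -> 0 a.s.
```

Here the standard deviation of $S_n / n$ is of order $n^{-1/2}$, so for $n = 200{,}000$ the printed value should be on the order of a few thousandths.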