Let $\{X_n, Y_n, n \in \mathbb{N}\}$ be a sequence of independent random variables, where for any $n \in \mathbb{N}$, $X_n$ and $Y_n$ have identical distributions. Prove that $X_n+Y_n \rightarrow 0$ almost surely if and only if $X_n \rightarrow 0$ almost surely and $Y_n \rightarrow 0$ almost surely.
I know how to prove the sufficiency, but I encountered some difficulties when proving the necessity. How can the almost sure convergence of the sum of random variables imply the almost sure convergence of individual random variables?
Suppose, for contradiction, that $X_n$ does not converge to $0$ almost surely. Then there exists $\varepsilon>0$ such that the event $\{|X_n|>\varepsilon\text{ for infinitely many }n\}$ has positive probability. Since this event is contained in $\{X_n>\varepsilon\text{ i.o.}\}\cup\{X_n<-\varepsilon\text{ i.o.}\}$, at least one of these two events has positive probability; replacing $(X_n,Y_n)$ by $(-X_n,-Y_n)$ if necessary, we may assume $\mathbb P(X_n>\varepsilon\text{ i.o.})>0$. By the contrapositive of the first Borel–Cantelli lemma, this forces $\sum_{n\in\mathbb N}\mathbb P(X_n>\varepsilon)=+\infty$.

Next, the variables $X_n+Y_n$, $n\in\mathbb N$, are independent, so if $\sum_{n}\mathbb P(|X_n+Y_n|>\delta)=+\infty$ for some $\delta>0$, the second Borel–Cantelli lemma would give $|X_n+Y_n|>\delta$ infinitely often almost surely, contradicting $X_n+Y_n\to0$ almost surely. Hence
$$\sum_{n\in\mathbb N}\mathbb P(|X_n+Y_n|>\delta)<+\infty\qquad\text{for every }\delta>0.$$

A word of caution: one cannot simply combine $X_n>\varepsilon$ i.o. and $Y_n>\varepsilon$ i.o., since these may occur along different indices (the joint probabilities $\mathbb P(X_n>\varepsilon)^2$ need not have a divergent sum, e.g. when $\mathbb P(X_n>\varepsilon)=1/n$). Instead, write $p_n=\mathbb P(X_n>\varepsilon)$ and $r_n=\mathbb P(X_n<-\varepsilon/2)$; since $X_n$ and $Y_n$ are identically distributed, these also equal $\mathbb P(Y_n>\varepsilon)$ and $\mathbb P(Y_n<-\varepsilon/2)$. By independence of $X_n$ and $Y_n$,
$$\mathbb P(X_n+Y_n>\varepsilon/2)\ \ge\ \mathbb P(X_n>\varepsilon)\,\mathbb P(Y_n\ge-\varepsilon/2)=p_n(1-r_n),$$
so $\sum_n p_n(1-r_n)<+\infty$ by the display above. For indices with $r_n\le1/2$ we have $p_n\le 2p_n(1-r_n)$, hence $\sum_{\{n:\,r_n\le1/2\}}p_n<+\infty$; since $\sum_n p_n=+\infty$, the set $\{n:r_n>1/2\}$ must be infinite. But for every $n$ with $r_n>1/2$, independence gives
$$\mathbb P(X_n+Y_n<-\varepsilon)\ \ge\ \mathbb P(X_n<-\varepsilon/2)\,\mathbb P(Y_n<-\varepsilon/2)=r_n^2>\frac14,$$
so $\sum_n\mathbb P(|X_n+Y_n|>\varepsilon)=+\infty$, contradicting the display above. Therefore $X_n\to0$ almost surely, and the symmetric argument gives $Y_n\to0$ almost surely as well.
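As a numerical sanity check (not part of the proof) of the Borel–Cantelli dichotomy used here: for independent events $A_n$ with $\mathbb P(A_n)=p_n$, the probability that some $A_n$ with $n\ge N$ occurs is $1-\prod_{n\ge N}(1-p_n)$, which stays bounded away from $1$ when $\sum p_n<\infty$ and equals $1$ when $\sum p_n=\infty$. A short Python sketch (the function name and truncation point `M` are my own choices) contrasts the summable case $p_n=1/n^2$ with the non-summable case $p_n=1/n$:

```python
import math

def tail_occurrence_prob(p, N, M=10**5):
    """P(A_n occurs for some N <= n <= M), assuming the A_n are
    independent with P(A_n) = p(n); sums logs for numerical stability."""
    log_none = sum(math.log1p(-p(n)) for n in range(N, M + 1))
    return 1.0 - math.exp(log_none)

# Summable case p_n = 1/n^2: the tail product telescopes to (N-1)/N,
# so the occurrence probability is about 1/N -> 0 (only finitely many
# events occur almost surely, matching the first Borel-Cantelli lemma).
p_sq = lambda n: 1.0 / n**2
print(tail_occurrence_prob(p_sq, 10))   # ~ 0.1

# Non-summable case p_n = 1/n: the tail product tends to 0, so the
# occurrence probability is ~ 1 for every N (events occur infinitely
# often almost surely, matching the second Borel-Cantelli lemma).
p_lin = lambda n: 1.0 / n
print(tail_occurrence_prob(p_lin, 10))  # ~ 1.0
```

This is exactly the mechanism behind the proof: almost sure convergence to $0$ is equivalent, for independent sequences, to summability of the tail probabilities $\mathbb P(|X_n|>\varepsilon)$.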