Prove that the probability of convergence of a sequence of independent random variables (say $\{X_n\}$) is equal to zero or one. (Kai Lai Chung, *A Course in Probability Theory*, p. 82, Exercise 12)
If $\{X_n\}$ is a sequence of independent and identically distributed random variables that are not constant a.e., then $\mathbb P\{X_n\text{ converges}\}=0$. (Kai Lai Chung, *A Course in Probability Theory*, p. 82, Exercise 13)
I am trying to solve the above exercises. We should first show that $\mathbb P\{X_n\text{ converges}\}=0$ or $1$. It would suffice to show that for every $\varepsilon>0$ the events $\{|X_n-X_{n'}|>\varepsilon\}$, for $n,n'\in\mathbb{N}$ with $n\ne n'$, are pairwise independent; the corollary of the Borel–Cantelli lemma would then settle the first exercise. However, this seemingly obvious fact is giving me a hard time: I cannot formulate it rigorously.
They are not pairwise independent: for example, $X_1 - X_2$ and $X_1 - X_3$ are in general not independent, since their covariance is $\operatorname{Var}(X_1)$. What is true is that the event that the sequence converges is independent of $(X_1, X_2, \ldots, X_N)$ for every $N$: by the Cauchy criterion,
$$\{X_n\text{ converges}\}=\bigcap_{k\ge 1}\bigcup_{N\ge 1}\bigcap_{m,n\ge N}\Bigl\{|X_m-X_n|\le \tfrac1k\Bigr\},$$
and this event lies in $\sigma(X_N, X_{N+1},\ldots)$ for every $N$, i.e. it is a tail event. Kolmogorov's zero-one law then gives $\mathbb P\{X_n\text{ converges}\}\in\{0,1\}$.
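As a quick numerical sanity check (not part of Chung's exercise), a simulation can illustrate the covariance identity $\operatorname{Cov}(X_1-X_2,\,X_1-X_3)=\operatorname{Var}(X_1)$; the choice of standard normal variables and the sample size here are arbitrary illustration parameters:

```python
import numpy as np

# For i.i.d. X1, X2, X3, expanding the product gives
#   Cov(X1 - X2, X1 - X3) = Var(X1),
# so the differences are correlated (hence not independent)
# whenever X1 is not a.e. constant.
rng = np.random.default_rng(seed=0)
n = 500_000
X1, X2, X3 = rng.standard_normal((3, n))

D12 = X1 - X2
D13 = X1 - X3

# Empirical covariance; should be close to Var(X1) = 1 for standard normals.
emp_cov = np.cov(D12, D13)[0, 1]
print(f"empirical Cov(X1-X2, X1-X3) = {emp_cov:.3f}")
```

With half a million samples the estimate lands within a few thousandths of the theoretical value $1$, confirming that the pairwise-independence route cannot work.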