Although related to probability theory, this statement seems to me to be purely a matter of real analysis.
I am trying to prove that if $X_n$ is a sequence of random variables and $X$ is another random variable such that
$$\forall\epsilon>0,\ \sum_{n=1}^\infty P(|X_n-X|>\epsilon)<\infty,$$
then $X_n\xrightarrow{a.s.} X$.
My proof:
$$\begin{aligned}
&\forall\epsilon>0,\ \sum_{n=1}^\infty P(|X_n-X|>\epsilon)<\infty\\
&\implies \forall\epsilon>0,\ P\Big(\limsup_{n\to\infty}\{|X_n-X|>\epsilon\}\Big)=0 \qquad \text{(Borel–Cantelli)}\\
&\implies \forall\epsilon>0,\ P(|X_n-X|>\epsilon \text{ for infinitely many } n)=0\\
&\implies P(X_n\to X \text{ as } n\to\infty)=1\\
&\implies X_n\xrightarrow{a.s.} X
\end{aligned}$$
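To make the question self-contained, here is the set identity I believe I am implicitly relying on in the last two implications, with the uncountable family of $\epsilon$'s replaced by the countable sequence $\epsilon = 1/k$ (my notation $A_n^\epsilon$ is introduced here, not part of the original statement):

```latex
% Writing A_n^eps = {|X_n - X| > eps}, the event of non-convergence is
\[
\{X_n \not\to X\} \;=\; \bigcup_{k=1}^{\infty} \limsup_{n\to\infty} A_n^{1/k},
\]
% so by countable subadditivity, each term being null by Borel--Cantelli,
\[
P(X_n \not\to X) \;\le\; \sum_{k=1}^{\infty} P\Big(\limsup_{n\to\infty} A_n^{1/k}\Big) \;=\; 0.
\]
```

If this is indeed the step needed to pass from "for every $\epsilon$" to "almost surely", please point out whether my original chain of implications already covers it.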
Is everything rigorously correct? In particular, can I replace the implication sign between the second and third lines with a double-sided equivalence sign ($\iff$)?