Durrett 2.3.9: Let $d$ be a metric on the space of random variables on a given probability space. Show that the metric $d$ is complete. That is, if $d(X_m, X_n)\to 0$ as $m,n\to\infty$, then there is a random variable $X$ such that $X_n\to X$ in probability as $n\to\infty$.
The previous exercise, 2.3.8, proved that $d$ is indeed a metric and that $d(X_n, X)\to 0$ if and only if $X_n\to X$ in probability, which I managed to do successfully. To prove the present claim, I don't think the particular definition of $d$ matters as long as we have the results from 2.3.8, so I'll leave it out.
Since $\{X_n\}$ is Cauchy in $d$, it suffices to show that it has a convergent subsequence; by 2.3.8, that means finding a subsequence which converges in probability. Using the Cauchy property, choose $n_1 < n_2 < \cdots$ such that $P(\lvert X_{n_{k+1}}-X_{n_k}\rvert\geq 2^{-k})\leq 4^{-k}$. Since $\sum_k 4^{-k}<\infty$, the Borel–Cantelli lemma gives $\lvert X_{n_{k+1}}-X_{n_k}\rvert\le 2^{-k}$ eventually a.s.; in particular, the series $$X=X_{n_1}+\sum_{k=1}^\infty (X_{n_{k+1}}-X_{n_k})=\lim_{k\to\infty}X_{n_k}$$ converges a.s. Therefore $X_{n_k}\to X$ in probability, and the claim follows.
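To fill in the convergence step: on the almost-sure event given by Borel–Cantelli, there is a (random) index $K = K(\omega)$ with $\lvert X_{n_{k+1}}-X_{n_k}\rvert \le 2^{-k}$ for all $k\ge K$, so the tail of the series is dominated by a geometric series:

```latex
\sum_{k=K}^{\infty} \bigl\lvert X_{n_{k+1}} - X_{n_k} \bigr\rvert
  \;\le\; \sum_{k=K}^{\infty} 2^{-k}
  \;=\; 2^{-(K-1)} \;<\; \infty .
```

Hence the series converges absolutely, and in particular converges, on that event.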
This answer is from here. Now could someone explain why $$X_{n_1}+\sum_{k=1}^\infty (X_{n_{k+1}}-X_{n_k})=\lim_{k\to\infty}X_{n_k}$$ holds? I had gotten to that part on my own, but I'm having trouble wrapping my head around why it's true.
The $N$-th partial sum of $X_{n_1}+\sum (X_{n_{k+1}}-X_{n_k})$ telescopes to $X_{n_{N+1}}$. Since the increments are eventually bounded by $2^{-k}$ almost surely (by the Borel–Cantelli lemma), the series converges absolutely almost surely; hence $\lim_{N\to \infty} X_{n_{N+1}}$ exists almost surely, and we define this limit as $X$.
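To spell out the telescoping, write out the $N$-th partial sum and cancel adjacent terms:

```latex
X_{n_1} + \sum_{k=1}^{N} \bigl(X_{n_{k+1}} - X_{n_k}\bigr)
  = X_{n_1} + (X_{n_2} - X_{n_1}) + (X_{n_3} - X_{n_2}) + \cdots + (X_{n_{N+1}} - X_{n_N})
  = X_{n_{N+1}} .
```

So the infinite series converges at a point $\omega$ exactly when $\lim_{N\to\infty} X_{n_{N+1}}(\omega) = \lim_{k\to\infty} X_{n_k}(\omega)$ exists, and the two limits agree.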