Let $(X_n)_{n=1}^{\infty}$ be a sequence of independent, mean-zero random variables with $\sum_{n=1}^{\infty}\text{Var}(X_n) < \infty$.
This is about an intermediate step in the proof of the Khintchine-Kolmogorov Convergence Theorem. Denote $S_j = X_1 + \cdots + X_j$. I have that for any $h > 0$, $$\mathbb{P}\left(\sup_{i \geq 1}|S_{k + i} - S_k| \geq h \right) \leq \dfrac{1}{h^2}\sum_{j=k+1}^{\infty}\sigma^2_j$$ where $\sigma^2_j$ denotes the variance of $X_j$. How do you make the leap from this to $$\mathbb{P}\left(\sup_{n, k \geq N}|S_{n} - S_k| \geq h \right) \leq \dfrac{2}{h^2}\sum_{j=N+1}^{\infty}\sigma^2_j$$ After sorting through nearly a dozen measure-theoretic probability books, I haven't seen this exact leap being made, but Durrett is very close.
Here's my attempt: let's write $$\sup_{n, k \geq N}|S_n - S_k| = \sup_{n, k \geq N}|S_n - S_{k +i} + S_{k+i} - S_k| \leq \sup_{n, k \geq N}|S_{k + i} - S_n| + \sup_{n, k \geq N}|S_{k + i} - S_k|\text{.}$$ I'm not exactly sure how the $\sup_{n, k \geq N}$ gets translated to $\sup_{i \geq 1}$ here, assuming this is even correct.
You can prove the inequality with a bigger constant on the right, but that is good enough for proving the theorem.
By the triangle inequality, $|S_n - S_k| \leq |S_n - S_N| + |S_k - S_N|$, so $\sup_{n,k \geq N} |S_n-S_k| \geq h$ implies $\sup_{n \geq N} |S_n-S_N| \geq h/2$ or $\sup_{k \geq N} |S_k-S_N| \geq h/2$. From this you get the required inequality with the RHS replaced by $2\frac 1 {(h/2)^{2}} \sum\limits_{j=N+1}^{\infty} \sigma_j^{2}$, which means $\frac 2 {h^{2}}$ gets replaced by $\frac {8} {h^{2}}$.
[$\sup_{n \geq N} |S_n -S_N|$ is the same as $\sup_{i \geq 1} |S_{N+i}-S_N|$.]
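To spell the argument out in one chain (this is just the answer above made explicit, using the union bound and then the maximal inequality with $k = N$ and threshold $h/2$):
$$\mathbb{P}\left(\sup_{n, k \geq N}|S_n - S_k| \geq h\right) \leq 2\,\mathbb{P}\left(\sup_{i \geq 1}|S_{N+i} - S_N| \geq \frac{h}{2}\right) \leq \frac{2}{(h/2)^2}\sum_{j=N+1}^{\infty}\sigma_j^2 = \frac{8}{h^2}\sum_{j=N+1}^{\infty}\sigma_j^2\text{.}$$
The constant $8$ is worse than the $2$ you quoted, but since the tail sum $\sum_{j=N+1}^{\infty}\sigma_j^2 \to 0$ as $N \to \infty$, any fixed constant suffices to conclude that $(S_n)$ is Cauchy in probability in the sense needed for the theorem.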