If $a_k > 0\ \forall\ k \in \mathbb{N}\ $ and $r>0,\ $ prove that $\displaystyle \sum_{k=1}^\infty\frac{1}{a_k}$ converges $\iff \displaystyle \sum_{k=1}^\infty\frac{1}{a_k+r}$ converges.
$\implies\ $ is easy by direct comparison: $0<\frac{1}{a_k + r}<\frac{1}{a_k}\ \forall\ k \in \mathbb{N},$ so convergence of the larger series forces convergence of the smaller.
$\impliedby\ $ is more difficult. I provide my answer below, but certainly welcome other answers: there are often simpler solutions, as I tend to over-complicate things.
We may use the limit comparison test. Note that if either series converges, its terms must tend to $0$, and since $a_k>0$, $$\frac{1}{a_k}\to 0\iff a_k\to\infty\iff\frac{1}{a_k+r}\to 0.$$ In particular, $a_k\to\infty$: for every $N$ there exists some $k_0$ for which $a_k>N$ for all $k>k_0$. We claim that $$\lim_{k\to\infty}\frac{a_k}{a_k+r}=1.$$ Indeed, for any $\epsilon>0$, $$\left|1-\frac{a_k}{a_k+r}\right|=\frac{r}{a_k+r}<\epsilon$$ is equivalent to $a_k>\frac{r}{\epsilon}-r$, so taking $N=\frac{r}{\epsilon}-r$ produces a $k_0$ for which the inequality holds for all $k>k_0$. Since the limit of the ratio of the terms is $1$, which is finite and nonzero, the limit comparison test shows that the two series converge or diverge together.
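Not a proof, of course, but a quick numeric check can make the equivalence and the key ratio limit concrete. The choices $a_k=k^2$ and $r=3$ below are my own illustrative assumptions, not part of the question:

```python
# Illustration only: with the hypothetical choice a_k = k^2 and r = 3,
# both partial sums settle near a finite value, and the term ratio
# a_k/(a_k + r) -- the quantity in the limit comparison test -- tends to 1.

def partial_sum(term, n):
    """Sum the first n terms of the series with general term `term(k)`."""
    return sum(term(k) for k in range(1, n + 1))

r = 3.0

# Partial sums of sum 1/a_k and sum 1/(a_k + r) with a_k = k^2.
s1 = partial_sum(lambda k: 1.0 / k**2, 100_000)
s2 = partial_sum(lambda k: 1.0 / (k**2 + r), 100_000)
print(f"sum 1/k^2       ~ {s1:.6f}")
print(f"sum 1/(k^2 + r) ~ {s2:.6f}")

# The ratio a_k/(a_k + r) approaches 1 as k grows.
for k in (10, 100, 1000):
    print(f"a_k/(a_k + r) at k={k}: {k**2 / (k**2 + r):.8f}")
```

The first partial sum approaches $\pi^2/6\approx 1.6449$, and the second stays strictly below it, consistent with the comparison in the $\implies$ direction.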