Consider the series $$\sum_{n=1}^\infty n e^{-n \varepsilon}$$ For $\varepsilon \leq 0$, it diverges. For $\varepsilon > 0$, it converges and equals $$\frac{e^\varepsilon}{(e^\varepsilon - 1)^2},$$ which has the asymptotic expansion $$\frac{1}{\varepsilon^2} - \frac{1}{12} + \frac{\varepsilon^2}{240} - \frac{\varepsilon^4}{6048} + O(\varepsilon^6)$$ as $\varepsilon \to 0^+$, with constant term $-\frac{1}{12}$.
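A quick numerical sanity check (a minimal sketch; the truncation point and tolerances are ad hoc) compares a truncated sum with the closed form, and exposes the constant term by subtracting the singular part:

```python
import math

def partial_sum(eps):
    """Truncate sum_{n>=1} n e^{-n eps} once the terms are negligible."""
    n_max = int(60 / eps)  # e^{-60} is far below double precision
    return sum(n * math.exp(-n * eps) for n in range(1, n_max + 1))

def closed_form(eps):
    """e^eps / (e^eps - 1)^2, using expm1 for small-eps accuracy."""
    return math.exp(eps) / math.expm1(eps) ** 2

eps = 0.1
print(partial_sum(eps), closed_form(eps))  # the two agree closely

# Subtracting the singular 1/eps^2 part exposes the constant term.
for eps in (0.1, 0.01, 0.001):
    print(eps, closed_form(eps) - 1 / eps**2)
```

The printed differences approach $-1/12 \approx -0.08333$ as $\varepsilon$ shrinks, consistent with the expansion above.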
Let $p_n$ denote the $n$th prime number, so that $p_n \sim n \log n$ by the prime number theorem. Let \begin{align} f_1(\varepsilon) &= \sum_{n=1}^\infty p_n e^{-n \varepsilon} \\ f_2(\varepsilon) &= \sum_{n=1}^\infty p_n e^{-p_n \varepsilon}. \end{align} Both series converge for all $\varepsilon > 0$. Do they have asymptotic expansions as $\varepsilon \to 0^+$? If so, is it possible to extract their constant terms?
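For concreteness, both sums can be evaluated by brute force. A minimal sketch (the sieve bound and the truncation threshold $n\varepsilon, p_n\varepsilon \lesssim 60$ are ad hoc choices, picked so the neglected tails are far below double precision):

```python
import math

def primes_up_to(n):
    """Simple sieve of Eratosthenes returning all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return [i for i in range(2, n + 1) if sieve[i]]

def f1(eps, primes):
    # sum_n p_n e^{-n eps}; truncate once n*eps is large
    n_max = min(len(primes), int(60 / eps))
    return sum(p * math.exp(-(i + 1) * eps) for i, p in enumerate(primes[:n_max]))

def f2(eps, primes):
    # sum_n p_n e^{-p_n eps}; terms with p*eps > 60 are negligible
    return sum(p * math.exp(-p * eps) for p in primes if p * eps < 60)

eps = 0.01
primes = primes_up_to(200_000)  # enough primes for both sums at this eps
print(f1(eps, primes), f2(eps, primes))
```

At $\varepsilon = 0.01$ this already shows $f_1 > f_2$, as must hold termwise since $p_n > n$.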
[Related question on MO. For more on smoothed sums, see reference 1.]
Heuristically, using the prime number theorem, \begin{align*} & f_2 (\varepsilon ) = \sum\limits_{n = 1}^\infty {p_n \,\mathrm{e}^{ - p_n \varepsilon } } = \int_2^{ + \infty } {t\,\mathrm{e}^{ - t\varepsilon } \mathrm{d}\pi (t)} \sim \int_2^{ + \infty } {\frac{{t\,\mathrm{e}^{ - t\varepsilon } }}{{\log (t)}}\mathrm{d}t} \\ & = \frac{1}{{\varepsilon ^2 \log (1/\varepsilon )}}\int_{2\varepsilon} ^{ + \infty } {\frac{{s\,\mathrm{e}^{ - s} }}{{1 + \log (s)/\log (1/\varepsilon )}}\mathrm{d}s} \\ & \sim \frac{1}{{\varepsilon ^2 \log (1/\varepsilon )}}\sum\limits_{n = 0}^\infty {\frac{{( - 1)^n }}{{\log ^n (1/\varepsilon )}}\int_{2\varepsilon} ^{ + \infty } {s\,\mathrm{e}^{ - s} \log ^n (s)\,\mathrm{d}s} } \\ & \sim \frac{1}{{\varepsilon ^2 \log (1/\varepsilon )}}\sum\limits_{n = 0}^\infty {\frac{{( - 1)^n }}{{\log ^n (1/\varepsilon )}}\int_0^{ + \infty } {s\,\mathrm{e}^{ - s} \log ^n (s)\,\mathrm{d}s} } \\ & = \frac{1}{{\varepsilon ^2 \log (1/\varepsilon )}}\left( {1 + \frac{{\gamma - 1}}{{\log (1/\varepsilon )}} + \frac{{\frac{{\pi ^2 }}{6} + \gamma ^2 - 2\gamma }}{{\log ^2 (1/\varepsilon )}} + \frac{{\frac{{\pi ^2 }}{2}(\gamma - 1) + \gamma ^2 (\gamma - 3) + 2\zeta (3)}}{{\log ^3 (1/\varepsilon )}} + \ldots } \right) \end{align*} as $\varepsilon \to 0^+$. The error should be exponentially small compared to any of the terms in the series. Here $\pi(t)$ is the prime counting function, $\gamma$ is the Euler–Mascheroni constant, and $\zeta(3)$ is Apéry's constant. For a numerical example, consider $\varepsilon = 2 \cdot 10^{-6}$. With this value, $f_2(\varepsilon)\approx 1.8512832 \cdot 10^{10}$, whereas the approximation truncated after the $\log ^{-3} (1/\varepsilon )$ term gives $\approx 1.8524638\cdot 10^{10}$.
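As a rough numerical check of the expansion, the following sketch compares a brute-force evaluation of $f_2$ with the series truncated after the $\log^{-3}(1/\varepsilon)$ term. I use a larger $\varepsilon$ than in the example above so that a small sieve suffices; the tolerance is ad hoc.

```python
import math

def primes_up_to(n):
    """Simple sieve of Eratosthenes returning all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return [i for i in range(2, n + 1) if sieve[i]]

eps = 1e-3
primes = primes_up_to(int(60 / eps))  # e^{-p eps} is negligible beyond this
f2 = sum(p * math.exp(-p * eps) for p in primes)

gamma = 0.5772156649015329   # Euler-Mascheroni constant
zeta3 = 1.2020569031595943   # Apery's constant zeta(3)
L = math.log(1 / eps)
approx = (1 / (eps**2 * L)) * (
    1
    + (gamma - 1) / L
    + (math.pi**2 / 6 + gamma**2 - 2 * gamma) / L**2
    + (math.pi**2 / 2 * (gamma - 1) + gamma**2 * (gamma - 3) + 2 * zeta3) / L**3
)
print(f2, approx, abs(f2 - approx) / f2)
```

Even at this comparatively large $\varepsilon$ the relative discrepancy is on the order of a percent, consistent with the first omitted term of size $\sim \log^{-4}(1/\varepsilon)$.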
Note that the coefficients in the asymptotic expansion may be obtained via the exponential generating function $$ \exp\! \bigg( {(\gamma - 1)z + \sum\limits_{n = 2}^\infty {\frac{{\zeta (n) - 1}}{n}z^n } } \bigg) = 1 + \frac{{\gamma - 1}}{{1!}}z + \frac{{\frac{{\pi ^2 }}{6} + \gamma ^2 - 2\gamma }}{{2!}}z^2 + \ldots , $$ valid for $|z|<2$. This relation may be proved using the observation $ \int_0^{ + \infty }s\,\mathrm{e}^{ - s} \log ^n (s)\,\mathrm{d}s = \Gamma ^{(n)} (2) $. The latter also leads to the simple asymptotic approximation $$ f_2 (\varepsilon ) \sim \int_0^1 {\frac{{\Gamma (1 + s)}}{{\varepsilon ^{1 + s} }}\mathrm{d}s} $$ as $\varepsilon \to 0^+$.
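Since $\sum_{n \ge 0} (-1)^n \Gamma^{(n)}(2) z^n/n! = \Gamma(2-z)$, the exponential generating function above is just $\Gamma(2-z)$, which can be checked numerically against `math.gamma` (a sketch; the truncation depths and the Euler–Maclaurin tail used for $\zeta(n)-1$ are ad hoc):

```python
import math

def zeta_minus_1(n, K=1000):
    """zeta(n) - 1 = sum_{k>=2} k^{-n}, with an Euler-Maclaurin tail estimate."""
    s = sum(k ** -n for k in range(2, K + 1))
    # tail sum_{k>K} k^{-n} ~ K^{1-n}/(n-1) - K^{-n}/2 + n K^{-n-1}/12
    return s + K ** (1 - n) / (n - 1) - K ** -n / 2 + n * K ** (-n - 1) / 12

def egf(z, n_max=60):
    """exp((gamma - 1) z + sum_{n>=2} (zeta(n) - 1) z^n / n)."""
    gamma = 0.5772156649015329
    return math.exp(
        (gamma - 1) * z
        + sum(zeta_minus_1(n) / n * z ** n for n in range(2, n_max + 1))
    )

# The generating function should agree with Gamma(2 - z) for |z| < 2.
for z in (0.25, 0.5, -0.5, 1.0):
    print(z, egf(z), math.gamma(2 - z))
```

At $z = 1$ this reduces to the classical identity $\sum_{n \ge 2} (\zeta(n)-1)/n = 1 - \gamma$.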
Addendum. I shall show that the error introduced by the first approximation step can be absorbed into any of the error terms of the final asymptotic expansion. We use the prime number theorem in the form $$ \pi (t) = \int_2^t {\frac{{\mathrm{d}s}}{{\log (s)}}} + R(t),\quad R(t) = \mathcal{O}\!\left( {\frac{t}{{\log ^{N + 2} (t)}}} \right) $$ where $N$ is an arbitrary fixed positive integer. Then \begin{align*} f_2 (\varepsilon ) & = \int_2^{ + \infty } {\frac{{t\,\mathrm{e}^{ - t\varepsilon } }}{{\log (t)}}\mathrm{d}t} + \int_2^{ + \infty } {t\,\mathrm{e}^{ - t\varepsilon } \,\mathrm{d}R(t)} \\ & = \int_2^{ + \infty } {\frac{{t\,\mathrm{e}^{ - t\varepsilon } }}{{\log (t)}}\mathrm{d}t} + \int_2^{ + \infty } {(\varepsilon t - 1)\,\mathrm{e}^{ - t\varepsilon } R(t)\,\mathrm{d}t} + \mathcal{O}(1), \end{align*} after an integration by parts, the boundary terms being $\mathcal{O}(1)$. But \begin{align*} \int_2^{ + \infty } {(\varepsilon t - 1)\,\mathrm{e}^{ - t\varepsilon } R(t)\,\mathrm{d}t} & = \mathcal{O}(1)\int_2^{ + \infty } {\frac{{t\left| {\varepsilon t - 1} \right|\mathrm{e}^{ - t\varepsilon } }}{{\log ^{N + 2} (t)}}\mathrm{d}t} \\ & = \mathcal{O}(1)\frac{1}{{\varepsilon ^2 \log ^{N + 2} (1/\varepsilon )}}\int_{2\varepsilon}^{ + \infty } {\frac{{s\left| {s - 1} \right|\mathrm{e}^{ - s} }}{{(1 + \log (s)/\log (1/\varepsilon ))^{N + 2} }}\mathrm{d}s} \\ & = \frac{1}{{\varepsilon ^2 \log ^2 (1/\varepsilon )}}\mathcal{O}\!\left( {\frac{1}{{\log ^N (1/\varepsilon )}}} \right) \end{align*} and the claim follows.
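The size of this correction term can also be probed numerically. The sketch below computes $R(t) = \pi(t) - \int_2^t \mathrm{d}s/\log(s)$ from a sieve and a trapezoidal rule, then checks that $\int_2^{\infty} (\varepsilon t - 1)\,\mathrm{e}^{-t\varepsilon} R(t)\,\mathrm{d}t$ is a small fraction of the main term $1/(\varepsilon^2 \log(1/\varepsilon))$; the grid, cutoff, and threshold are ad hoc.

```python
import math

def prime_indicator(n):
    """Sieve of Eratosthenes; returns a 0/1 bytearray indexed by integer."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(range(p * p, n + 1, p)))
    return sieve

eps = 1e-3
X = int(60 / eps)          # e^{-eps t} is negligible beyond this
is_prime = prime_indicator(X)

# Accumulate pi(t), the logarithmic integral from 2 (trapezoidal rule),
# R(t) = pi(t) - int_2^t ds/log s, and the correction integral (dt = 1).
pi_t = 0
li_t = 0.0
I = 0.0
for t in range(2, X + 1):
    if is_prime[t]:
        pi_t += 1
    if t > 2:
        li_t += 0.5 * (1 / math.log(t - 1) + 1 / math.log(t))
    R = pi_t - li_t
    I += (eps * t - 1) * math.exp(-eps * t) * R

main = 1 / (eps**2 * math.log(1 / eps))
print(I, main, abs(I) / main)
```

At this $\varepsilon$ the correction comes out orders of magnitude below the main term, as the bound predicts.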