Asymptotic behaviour of a Dirichlet series as $t\to\infty$ (Titchmarsh, *The Theory of Functions*)


I am stuck on the proof of a proposition in Titchmarsh's book *The Theory of Functions* (p. 297), which states that for a Dirichlet series $f(s)=\sum a_n n^{-s}$ we have
$$f(s)=O\!\left(|t|^{1-(\sigma-\sigma_0)+\epsilon}\right)\quad\text{as }|t|\to\infty,\qquad\forall\,\sigma\in(\sigma_0,\sigma_0+1).$$

The proof begins by assuming $\sigma_0=0$, so that the general case then follows by a change of variable in the Dirichlet series. My question is: why do we first suppose in the proof that $\sum a_n$ is convergent? In general we do not know whether a Dirichlet series converges at $\sigma=\sigma_0$; all we know is that it converges for $\sigma=\sigma_0+\epsilon$ for every $\epsilon>0$. I cannot see how it follows that $\sum a_n$ is convergent. Maybe this is an easy question, but I am really stuck on it.
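For context, the reduction to $\sigma_0=0$ mentioned above can be sketched as follows (a standard change of variable, assuming $f(s)=\sum a_n n^{-s}$ has abscissa of convergence $\sigma_0$; the notation $b_n$ is mine, not Titchmarsh's):

```latex
% Write s = \sigma_0 + w; the series in s becomes a series in w:
\[
  f(\sigma_0 + w)
  \;=\; \sum_{n=1}^{\infty} a_n\, n^{-\sigma_0 - w}
  \;=\; \sum_{n=1}^{\infty} b_n\, n^{-w},
  \qquad b_n := a_n\, n^{-\sigma_0}.
\]
% The shifted series \sum b_n n^{-w} converges precisely for
% \Re(w) > 0, i.e. its abscissa of convergence is 0. So proving
% the bound in the case \sigma_0 = 0 yields the general case.
```

Note that this shift only guarantees convergence for $\Re(w)>0$, not at $\Re(w)=0$ itself, which is exactly why the assumption that $\sum a_n$ (here $\sum b_n$) converges puzzles me.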