Apostol ANT chapter 13 Question 9


Given that $L(s,\chi)$ has a zero of order $m\ge1$ at $s=1+it$, prove that for this $t$ we have:

(a) $\frac{L'}{L}(\sigma+it,\chi)=\frac{m}{\sigma-1}+O(1)$ as $\sigma \to 1^{+}$

and

(b) there exists an integer $r \ge 0$ such that

$\frac{L'}{L}(\sigma+2it,\chi^{2})=\frac{r}{\sigma-1}+O(1)$ as $\sigma \to 1^{+}$, except when $\chi^{2}=\chi_{1}$, the principal character, and $t=0$.

Part (a) follows easily by writing $L(s,\chi)=(s-1-it)^{m}g(s)$ with $g$ analytic and non-vanishing at $1+it$, taking the logarithmic derivative, and letting $\sigma \to 1^{+}$.
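Concretely, the computation I have for part (a) is the standard one for a zero of order $m$:

$$\frac{L'}{L}(s,\chi)=\frac{m}{s-1-it}+\frac{g'}{g}(s),$$

and since $g'/g$ is analytic, hence bounded, in a neighborhood of $1+it$, setting $s=\sigma+it$ gives $\frac{L'}{L}(\sigma+it,\chi)=\frac{m}{\sigma-1}+O(1)$ as $\sigma\to 1^{+}$.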

I cannot see how to get part (b). Could you please point me in the right direction?