A disproof of RH?


Does this paper of Sondow et al., in which they propose a disproof of RH, have reasonable arguments? The Riemann Hypothesis is not true

BEST ANSWER

On page 10 (in the paragraph "Disproof of the RH") they claim that, under the RH, $\Re(\zeta'(s)/\zeta(s))>0$ for $\Re(s) \in (1/2,1)$ and $\Im(s)$ large.

This is false, and independently of the RH.
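In fact a direct numerical check already exhibits sign changes at moderate height. The sketch below (my addition; it uses mpmath, with $c=0.9$ as a sample abscissa of my choosing) scans $\Re(\zeta'/\zeta)$ along the line $\Re(s)=0.9$ and counts sign changes:

```python
from mpmath import mp, zeta

mp.dps = 15
c = 0.9  # sample abscissa in (1/2, 1)

def re_log_deriv(t):
    """Re(zeta'/zeta) at s = c + it."""
    s = mp.mpc(c, t)
    return mp.re(zeta(s, derivative=1) / zeta(s))

ts = [0.5 * k for k in range(2, 121)]  # heights t = 1.0, 1.5, ..., 60.0
vals = [re_log_deriv(t) for t in ts]
sign_changes = sum(1 for a, b in zip(vals, vals[1:]) if a * b < 0)
print(sign_changes)  # number of sign changes found below t = 60
```

This range of $t$ does not by itself address "$\Im(s)$ large", but the argument below shows the sign changes persist to arbitrary height.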

They mishandled the $O(\log T)$ oscillation of $N(T)-\frac{T}{2\pi}\log\left(\frac{T}{2\pi}\right)+\frac{T}{2\pi}$ in their eq. (26): it is simply not true that $f\ast g(T)\to 0$ as $T\to \infty$ for $f(u)=1_{u>2\pi}\log u$ and $g(u)=\frac{u}{((\sigma-1/2)^2 + u^2)^2}$.


Here is a proof that if $\zeta(s)$ has no zeros for $\Re(s)>\sigma$, then for any $c\in (\sigma,1)$, $\Re(\zeta'(s)/\zeta(s))$ changes sign infinitely often on the line $\Re(s)=c$.

For $\Re(z)>0$ let $$F(z)=\int_c^{c+i\infty} \left(\frac{\zeta'(s)}{\zeta(s)}+\frac{e^{i(s-1)}}{s-1}\right) e^{iz(s-c)}\,ds$$ The term $\frac{e^{i(s-1)}}{s-1}$ cancels the pole of $\frac{\zeta'(s)}{\zeta(s)}$ at $s=1$ (both have a simple pole there, with residues $+1$ and $-1$ respectively), and it is standard that the integral converges absolutely when $c>\sigma$.

By the Cauchy integral theorem (shifting the contour to the line $\Re(s)=2$, which is legitimate since the integrand is analytic in between) $$F(z)=\int_c^2 \left(\frac{\zeta'(s)}{\zeta(s)}+\frac{e^{i(s-1)}}{s-1}\right) e^{iz(s-c)}\,ds+\int_2^{2+i\infty}\frac{e^{i(s-1)}}{s-1} e^{iz(s-c)}\,ds+\int_2^{2+i\infty} \frac{\zeta'(s)}{\zeta(s)}e^{iz(s-c)}\,ds$$ $$ = G(z)-\int_2^{2+i\infty}\sum_{n\ge 2} \Lambda(n)n^{-s} e^{iz(s-c)}\,ds=G(z)+e^{iz(2-c)}\sum_{n\ge 2} \frac{\Lambda(n) n^{-2}}{iz-\log n}$$ where

  • $G(z)=\int_c^2 (\frac{\zeta'(s)}{\zeta(s)}+\frac{e^{i(s-1)}}{s-1}) e^{iz(s-c)}ds+\int_2^{2+i\infty}\frac{e^{i(s-1)}}{s-1} e^{iz(s-c)}ds$ is analytic for $\Re(z)>-1$

  • and $e^{iz(2-c)}\sum_{n\ge 2} \frac{\Lambda(n) n^{-2}}{iz-\log n}$ is analytic away from simple poles at the points $z=-i\log n$
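The key step above is the term-by-term evaluation of the shifted integral: for a single $n$, parametrizing $s=2+it$ gives $\int_2^{2+i\infty} n^{-s} e^{iz(s-c)}\,ds=\frac{n^{-2}e^{iz(2-c)}}{\log n - iz}$, which is what produces the simple poles at $z=-i\log n$. A quick mpmath sanity check of this identity, at sample values of $c$, $z$, $n$ of my choosing:

```python
from mpmath import mp, mpc, quad, exp, log

mp.dps = 15
# sample values: c in (sigma, 1), Re(z) > 0, a single n
c, z, n = mp.mpf('0.9'), mpc('0.3', '0.2'), 2

# left side: int_2^{2+i*inf} n^{-s} e^{iz(s-c)} ds with s = 2 + it;
# the integrand decays like e^{-t Re(z)}, so truncating at t = 100 is harmless
integrand = lambda t: 1j * exp(-(2 + 1j * t) * log(n)) * exp(1j * z * (2 + 1j * t - c))
lhs = quad(integrand, [10 * k for k in range(11)])  # subdivided to tame the oscillation

# right side: closed form n^{-2} e^{iz(2-c)} / (log n - iz)
rhs = n ** mp.mpf(-2) * exp(1j * z * (2 - c)) / (log(n) - 1j * z)

err = abs(lhs - rhs)
print(err)
```

The two sides agree to high precision, and the only singularity of the closed form in $z$ is the simple pole at $z=-i\log n$, as claimed.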

For $r >0$ let $$f(r)=\int_0^\infty \Re\left(\frac{\zeta'(c+it)}{\zeta(c+it)}\right) e^{-r t}\,dt=\Im(F(r))-\Im\left(\int_c^{c+i\infty} \frac{e^{i(s-1)}}{s-1} e^{ir(s-c)}\,ds\right)$$ If $\Re\left(\frac{\zeta'(c+it)}{\zeta(c+it)}\right)$ did not change sign infinitely often, i.e. had constant sign for $t$ large, then by Landau's theorem the Laplace transform $\int_0^\infty \Re\left(\frac{\zeta'(c+it)}{\zeta(c+it)}\right) e^{-r t}\,dt$ would have a singularity at its (finite, real) abscissa of convergence. This is absurd, since (the analytic continuation of) $f(r)$ is analytic for $r > -1$: the only singularities of $F$ there are the simple poles at the points $-i\log n$, none of which is real.
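The last step invokes a classical fact (Landau's theorem for Laplace-type integrals with eventually sign-constant integrand); for completeness, the statement being used is:

```latex
% Landau's theorem, as used above: for a locally integrable \phi of
% constant sign on [t_0, \infty), the Laplace transform
%   f(r) = \int_0^\infty \phi(t) e^{-rt} \, dt
% cannot be continued analytically past the real point r = \sigma_c,
% where \sigma_c (assumed finite) is the abscissa of convergence.
\[
  \phi(t) \ge 0 \ \text{for } t \ge t_0
  \quad\Longrightarrow\quad
  f(r) = \int_0^\infty \phi(t)\, e^{-rt}\,dt
  \ \text{is singular at } r = \sigma_c .
\]
```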