Why couldn't Báez-Duarte prove the Riemann Hypothesis?


Define \begin{equation} I_n=\int_{0}^{1/n} |U s_{n}(x)|^2 \,\mathrm{d}x, \end{equation} where $Us_{n}(x)=\frac{1}{x}\sum_{j=1}^{n} \frac{\mu(j)}{j}\rho(jx)$, $\mu$ denotes the Möbius function, and $\rho(y)$ is the fractional part of $y$. We make three crucial observations.

Firstly, since $0\leq \rho(jx)\leq jx$ for every $j\geq 1$ and $x\geq 0$, we have $\lim_{x\rightarrow 0^+} \frac{\rho(jx)}{jx}<\infty$, so the integrand of $I_n$ is well defined for all $x>0$ and remains bounded as $x\rightarrow 0^+$.

Secondly, the integral $I_n$ is taken over the finite range $(0, 1/n)$.

Thirdly, by identity 2.14 of Báez-Duarte we have $\lim_{n\rightarrow \infty} Us_{n}(x)=-\frac{\sin 2\pi x}{\pi x}$, hence $|Us_{n}(x)|<c/x$ for all $n$, where $c$ is some positive constant.

These observations collectively imply that $I_n \leq C$ for every positive integer $n$, where $C$ is some positive constant, or equivalently, \begin{equation} \Big(\sum_{j=1}^n \mu(j)\Big)^2 = O(n), \end{equation} by identity 2.12 of Báez-Duarte, which states that \begin{equation} I_n = \frac{1}{n}\Big(\sum_{j=1}^n \mu(j)\Big)^{2}. \end{equation} Since it is known that the RH is equivalent to the statement that $\Big(\sum_{j=1}^n \mu(j)\Big)^2 = O(n^{1+\epsilon})$ for every $\epsilon>0$, couldn't Báez-Duarte have concluded this way that the RH is true?
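For what it's worth, identity 2.12 itself is easy to sanity-check numerically: for $0<x<1/n$ one has $jx<1$ for every $j\le n$, so $\rho(jx)=jx$ and $Us_n(x)$ is the constant $M(n)=\sum_{j=1}^n \mu(j)$, giving $I_n=M(n)^2/n$ exactly. A minimal Python sketch (the sieve helper `mobius_sieve` is my own, not from the paper):

```python
def mobius_sieve(n):
    """Return a list mu with mu[j] = Mobius mu(j) for 1 <= j <= n (linear sieve)."""
    mu = [0, 1] + [1] * (n - 1)
    primes, is_comp = [], [False] * (n + 1)
    for i in range(2, n + 1):
        if not is_comp[i]:
            primes.append(i)
            mu[i] = -1
        for p in primes:
            if i * p > n:
                break
            is_comp[i * p] = True
            if i % p == 0:
                mu[i * p] = 0          # p^2 divides i*p, so mu vanishes
                break
            mu[i * p] = -mu[i]         # one extra distinct prime factor
    return mu

def Us(n, x, mu):
    # Us_n(x) = (1/x) * sum_{j=1}^n mu(j)/j * frac(j*x)
    return sum(mu[j] / j * ((j * x) % 1.0) for j in range(1, n + 1)) / x

n = 50
mu = mobius_sieve(n)
M = sum(mu[1:n + 1])                   # Mertens function M(n)
x = 0.5 / n                            # a sample point inside (0, 1/n)
assert abs(Us(n, x, mu) - M) < 1e-9    # Us_n is constant = M(n) on (0, 1/n)
I_n = M * M / n                        # hence I_n = M(n)^2 / n (identity 2.12)
```

So observation two and identity 2.12 are not in doubt; the gap in the argument is elsewhere.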

Accepted answer:

Your argument for "these observations collectively imply that $I_n\le C$ for every positive integer $n$, where $C$ is some positive constant" rests on the following principle: if the functions $f_n$ converge pointwise to an integrable function $f$, then $\int_{\Omega_n} f_n$ is bounded in terms of $\int_{\Omega} f$ for some $\Omega$ containing all the $\Omega_n$.

This is simply false. For example, the functions $f_n(x)=\frac{1}{nx}$ converge pointwise on $(0,1)$ to $f(x)=0$, yet $\int_0^{1/n} f_n(x)\,\mathrm{d}x=\infty$ for every $n$.
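The divergence is easy to see numerically: truncating the counterexample integral at a lower endpoint $\varepsilon$ gives $\int_\varepsilon^{1/n} \frac{\mathrm{d}x}{nx} = \frac{1}{n}\log\frac{1}{n\varepsilon}$, which grows without bound as $\varepsilon\to 0^+$ even though $f_n(x)=\frac{1}{nx}\to 0$ pointwise. A quick sketch:

```python
import math

n = 10
# f_n(x) = 1/(n*x) tends to 0 pointwise: at a fixed x, larger n gives a smaller value.
x_fixed = 0.5
assert 1.0 / (1000 * x_fixed) < 1.0 / (10 * x_fixed)

# Yet the integral of f_n over (eps, 1/n) = log(1/(n*eps))/n blows up as eps -> 0.
for eps in (1e-2, 1e-4, 1e-8, 1e-16):
    truncated = math.log(1.0 / (n * eps)) / n   # exact value of the truncated integral
    print(f"eps = {eps:.0e}:  integral over (eps, 1/n) = {truncated:.3f}")
```

The printed values increase without bound as `eps` shrinks, which is exactly why the pointwise bound $|Us_n(x)|<c/x$ cannot control $I_n$ uniformly in $n$.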