My attempt to follow the Tatuzawa–Iseki strategy to bound $\int_2^x \frac{dt}{\log t}-\pi(x)$, where $\pi(x)$ is the prime-counting function


I don't know whether this exercise appears in the literature. Throughout, $Li(x)=\int_2^x\frac{dt}{\log t}$ denotes the logarithmic integral and $\pi(x)$ the prime-counting function.

Question. Find a good bound for $$ \left|Li(x)-\pi(x) \right| $$ using summation methods, Möbius inversion, Shapiro's Tauberian theorem, bounds on integrals, and the statement of the Prime Number Theorem. Many thanks.

I know that proving the bound $O(\sqrt{x}\log x)$ unconditionally is an open problem, but I don't know which of the terms in the sums below cannot be brought under a bound of $O(\sqrt{x}\log x)$. Note that I don't know what cancellations might occur among the terms, nor do I know the techniques used to obtain the best known bounds for this difference. Inspired by Apostol's words in Section 4.11 (see the reference below), I attempt to follow the Tatuzawa–Iseki strategy.

Specializing Apostol's Theorem 4.17 (Apostol, Introduction to Analytic Number Theory, Springer) with $F(x)=1$, so that $G(x):= \left[ x \right] \log x$, then dividing by $\log^2 x$ for $x\geq 2$ and integrating, gives $$Li(x)+\int_2^x\frac{\psi(t)}{\log^2 t}\,dt=\int_2^x\frac{\sum_{n\leq t}\mu(n) \left[ \frac{t}{n} \right] \log\frac{t}{n}}{\log^2 t}\,dt,$$ where $\psi(x)=\sum_{n\leq x}\Lambda(n)$ is the second Chebyshev function, $\Lambda(n)$ is the von Mangoldt function, $\mu(n)$ is the Möbius function, and $ \left[ x \right] $ is, as usual, the integer-part function. If I have made no mistakes, applying the Prime Number Theorem (for the $O(1)$ term) I can write $\sum_{n\leq x}\mu(n) \left[ \frac{x}{n} \right] \log\frac{x}{n}$ as $$x\log x \left( O(1)+O \left( \sum_{n\leq x} \mu(n)\right) \right) -x\sum_{n\leq x}\frac{\mu(n)\log n}{n}+O \left( x\sum_{n\leq x} \mu(n)\log n\right) ,$$ and I know that summation methods give bounds for further terms (*).
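Before dividing and integrating, the $F(x)=1$ specialization is an exact identity: $\log x+\psi(x)=\sum_{n\leq x}\mu(n)\left[\frac{x}{n}\right]\log\frac{x}{n}$. As a sanity check (not part of the argument; the helper names are mine), a short Python script can verify this numerically:

```python
import math

def sieve_mu_lambda(N):
    """Smallest-prime-factor sieve giving mu(n) and Lambda(n) for 1 <= n <= N."""
    spf = list(range(N + 1))          # spf[n] = smallest prime factor of n
    for p in range(2, int(N**0.5) + 1):
        if spf[p] == p:
            for q in range(p * p, N + 1, p):
                if spf[q] == q:
                    spf[q] = p
    mu = [0] * (N + 1)
    lam = [0.0] * (N + 1)             # von Mangoldt function
    mu[1] = 1
    for n in range(2, N + 1):
        p = spf[n]
        m, k = n, 0
        while m % p == 0:
            m //= p
            k += 1
        mu[n] = 0 if k > 1 else -mu[m]
        if m == 1:                    # n = p^k is a prime power
            lam[n] = math.log(p)
    return mu, lam

def check_identity(x):
    """Check log(x) + psi(x) == sum_{n<=x} mu(n) * floor(x/n) * log(x/n)."""
    N = int(x)
    mu, lam = sieve_mu_lambda(N)
    lhs = math.log(x) + sum(lam)      # log x + psi(x)
    rhs = sum(mu[n] * (N // n) * math.log(x / n) for n in range(1, N + 1))
    return lhs, rhs

lhs, rhs = check_identity(1000)
print(abs(lhs - rhs))                 # agreement up to floating-point error
```

The two sides agree to floating-point precision, which is consistent with the identity being exact rather than asymptotic.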

For a second specialization of the cited theorem, take $\tilde{F}(x)=x$, so that $\tilde{G}(x):=x\log x\sum_{n\leq x}\frac{1}{n}$; we know that $\tilde{G}(x)=x\log^2x+\gamma x \log x+O(\log x)$, where $\gamma$ is the Euler–Mascheroni constant. Then, by the theorem, for $x\geq 2$

$$\frac{x}{\log x}+\frac{x}{\log^2 x} \left(\sum_{n\leq x} \frac{\Lambda(n)}{n}\right)=\frac{1}{\log^2 x}\sum_{n\leq x}\mu(n)\tilde{G} \left( \frac{x}{n} \right) , $$

and combining this with the known estimate $\pi(x)=\frac{x}{\log x}+O \left( \frac{x}{\log^2 x} \right) $, with Shapiro's Tauberian theorem applied to get the asymptotic for $\sum_{n\leq x}\frac{\Lambda(n)}{n}$, and with the first specialization of the cited theorem, $$ Li(x)-\pi(x) = -\int_2^x\frac{\psi(t)}{\log^2 t}\,dt-\frac{x}{\log x}+\int_2^x\frac{\sum_{n\leq t}\mu(n) \left[ \frac{t}{n} \right] \log\frac{t}{n}}{\log^2 t}\,dt$$ $$\qquad\qquad\qquad\qquad-\frac{1}{\log^2 x}\sum_{n\leq x}\mu(n)\tilde{G} \left( \frac{x}{n} \right)+O \left( \frac{x}{\log ^2 x} \right). $$
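The second specialization is also an exact identity before dividing by $\log^2 x$: $x\log x + x\sum_{n\leq x}\frac{\Lambda(n)}{n}=\sum_{n\leq x}\mu(n)\tilde{G}\left(\frac{x}{n}\right)$. A quick numerical check (a sketch with my own helper names, not part of the argument):

```python
import math

def factorize(n):
    """Trial division: returns the list of (prime, exponent) pairs of n."""
    out, d = [], 2
    while d * d <= n:
        if n % d == 0:
            e = 0
            while n % d == 0:
                n //= d
                e += 1
            out.append((d, e))
        d += 1
    if n > 1:
        out.append((n, 1))
    return out

def mobius(n):
    """Mobius function mu(n)."""
    f = factorize(n)
    return 0 if any(e > 1 for _, e in f) else (-1) ** len(f)

def von_mangoldt(n):
    """Lambda(n) = log p if n is a power of the prime p, else 0."""
    f = factorize(n)
    return math.log(f[0][0]) if len(f) == 1 else 0.0

def G_tilde(y):
    """G~(y) = y * log(y) * sum_{m <= y} 1/m (the second specialization)."""
    return y * math.log(y) * sum(1.0 / m for m in range(1, int(y) + 1))

def check_second(x):
    """Check x*log(x) + x*sum Lambda(n)/n == sum mu(n) * G~(x/n)."""
    N = int(x)
    lhs = x * math.log(x) + x * sum(von_mangoldt(n) / n for n in range(2, N + 1))
    rhs = sum(mobius(n) * G_tilde(x / n) for n in range(1, N + 1))
    return lhs, rhs

lhs, rhs = check_second(200)
print(abs(lhs - rhs))   # agreement up to floating-point error
```

Dividing this exact identity by $\log^2 x$ yields the display above, with $\frac{x}{\log x}$ as the first term on the left.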

To clarify my question: at (*) and here, I want to check, with these functions or with different specializations, how much can be said about the difference $Li(x)-\pi(x)$. I know I can use the triangle inequality, but, as I said, my computations may not be the best possible. I would like to test the cited Theorem 4.17 with techniques of the kind cited above, assuming the statement of the Prime Number Theorem. My question can be read as: which term cannot be bounded by $O(\sqrt{x}\log x)$, in my computations or in your own, as you prefer.

On BEST ANSWER

From your calculations you cannot obtain the error term $O\left(\sqrt{x}\log\left(x\right)\right) $. If you use the PNT in the form $$\pi\left(x\right)=\frac{x}{\log\left(x\right)}+O\left(\frac{x}{\log^{2}\left(x\right)}\right) $$ you already have an error larger than $O\left(\sqrt{x}\log\left(x\right)\right) $: note that for every $m\geq0 $ the term $O\left(\frac{x}{\log^{m}\left(x\right)}\right) $ is larger than $O\left(\sqrt{x}\log\left(x\right)\right) $. If we want to obtain this error term we have to assume RH (the Riemann Hypothesis).

The first step is to prove that the PNT is equivalent to the asymptotic $$\psi\left(x\right)\sim x,\quad x\rightarrow\infty, $$ and from the truncated Perron formula we can get (for more details see here, exercise $11$ and proposition $12$) $$\psi\left(x\right)=x-\sum_{\left|\gamma\right|\leq T}\frac{x^{\rho}}{\rho}+O\left(\frac{x}{T}\log^{2}\left(xT\right)\right), $$ where $\rho=\beta+i\gamma $ runs over the non-trivial zeros of the Riemann zeta function and $T>1$. So if we assume RH (i.e. $\beta=\frac{1}{2} $) we get $$\psi\left(x\right)-x\ll x^{1/2}\sum_{0<\gamma\leq T}\frac{1}{\gamma}+\frac{x}{T}\log^{2}\left(xT\right). $$ Note that the assumption of RH is essential here. Without this hypothesis we have to take $\max_{\beta}\left\{ x^{\beta}\right\} $ and then use an estimate from the zero-free region of the Riemann zeta function, which gives a larger error. So the best bound we can get (that is, the best exponent) is obtained when $\beta=1/2$.
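Although the bound below is conditional on RH, one can at least watch the normalized error $\left|\psi(x)-x\right|/(\sqrt{x}\log^2 x)$ numerically for small $x$. A sketch (the helper name `psi` is mine), not a proof of anything:

```python
import math

def psi(x):
    """Chebyshev psi(x): sum of log(p) over prime powers p^k <= x."""
    N = int(x)
    is_prime = [True] * (N + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(N**0.5) + 1):
        if is_prime[p]:
            for q in range(p * p, N + 1, p):
                is_prime[q] = False
    total = 0.0
    for p in range(2, N + 1):
        if is_prime[p]:
            pk = p
            while pk <= N:       # count every prime power p, p^2, p^3, ...
                total += math.log(p)
                pk *= p
    return total

# the ratio |psi(x) - x| / (sqrt(x) * log(x)^2) stays well below 1 here
for x in (10**3, 10**4, 10**5):
    print(x, abs(psi(x) - x) / (math.sqrt(x) * math.log(x) ** 2))
```

Small values of the ratio in this range are consistent with, but of course do not prove, the RH-conditional bound $(1)$ below.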
Now, from the Riemann–von Mangoldt formula and partial summation, we can prove that $$\sum_{0<\gamma\leq T}\frac{1}{\gamma}\ll\log^{2}\left(T\right), $$ so $$\psi\left(x\right)-x\ll x^{1/2}\log^{2}\left(T\right)+\frac{x}{T}\log^{2}\left(xT\right), $$ and taking $T=x^{1/2} $ we get $$\psi\left(x\right)=x+O\left(x^{1/2}\log^{2}\left(x\right)\right).\tag{1} $$ The next step is to observe that, by Abel summation, $$\pi_{1}\left(x\right):=\sum_{n\leq x}\frac{\Lambda\left(n\right)}{\log\left(n\right)}=\frac{\psi\left(x\right)}{\log\left(x\right)}+\int_{2}^{x}\frac{\psi\left(t\right)}{t\log^{2}\left(t\right)}dt,\tag{2} $$ so, using $(1)$ in $(2)$, we get $$\pi_{1}\left(x\right)=\textrm{Li}\left(x\right)+O\left(\sqrt{x}\log\left(x\right)\right)+O\left(\int_{2}^{x}\frac{1}{\sqrt{t}}dt\right) $$ $$=\textrm{Li}\left(x\right)+O\left(\sqrt{x}\log\left(x\right)\right).\tag{3} $$ The last step is to observe that $$\pi_{1}\left(x\right)=\sum_{p^{m}\leq x}\frac{\log\left(p\right)}{m\log\left(p\right)}=\pi\left(x\right)+O\left(\pi\left(x^{1/2}\right)\right), $$ and since $\pi\left(x^{1/2}\right)\leq x^{1/2} $, which is smaller than the error in $(3)$, we obtain $$\pi\left(x\right)=\textrm{Li}\left(x\right)+O\left(\sqrt{x}\log\left(x\right)\right).$$
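The final bound can be watched numerically for moderate $x$: compare $\textrm{Li}(x)-\pi(x)$ against $\sqrt{x}\log x$. A sketch with my own helper names (`pi_count`, `li`), using a sieve and a trapezoid-rule integral; illustrative only:

```python
import math

def pi_count(x):
    """Prime-counting function pi(x) via a sieve of Eratosthenes."""
    N = int(x)
    is_prime = [True] * (N + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(N**0.5) + 1):
        if is_prime[p]:
            for q in range(p * p, N + 1, p):
                is_prime[q] = False
    return sum(is_prime)

def li(x, steps=100000):
    """Li(x) = integral_2^x dt/log(t), by the trapezoid rule."""
    h = (x - 2.0) / steps
    s = 0.5 * (1.0 / math.log(2.0) + 1.0 / math.log(x))
    for k in range(1, steps):
        s += 1.0 / math.log(2.0 + k * h)
    return s * h

# the gap Li(x) - pi(x) stays far below sqrt(x) * log(x) in this range
for x in (10**3, 10**4, 10**5):
    print(x, li(x) - pi_count(x), math.sqrt(x) * math.log(x))
```

In this range the gap is small compared with $\sqrt{x}\log x$; the general statement, as the answer explains, is conditional on RH.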

Note: This proof is classical; it is taken from Davenport, "Multiplicative Number Theory", second edition.