Deducing the order of π(x) from Chebyshev's theorem (doubt about the integrals)


While self-studying analytic number theory from An Introduction to Sieve Methods and Its Applications by M. Ram Murty and Alina Carmen Cojocaru, I have a doubt about a passage in the text.

The authors assume Chebyshev's theorem, whose statement is: there exist positive constants $A$ and $B$ such that $Ax < \theta(x) < Bx$.

The authors then say that, by partial summation, this implies the bound $\pi(x) = O\left(\frac{x}{\log x}\right)$.

What I have done: in the Abel summation formula I take $a(n) = b(n)\log n$, where $b(n) = 1$ if $n$ is prime and $0$ otherwise, and $f(t) = 1/\log t$. So the right-hand side becomes $\frac{\theta(x)}{\log x} + \int_{2}^{x} \frac{\theta(t)}{t\log^2 t}\,dt$. Now I use $\theta(t) \sim t$ as $t \to \infty$, bound $\theta(t) \le Mt$, and write $\int_{2}^{x} = \int_{2}^{\infty} - \int_{x}^{\infty}$.

But now on the RHS I have $O\left(\frac{x}{\log x}\right) + O\left(\int_{2}^{\infty} \frac{dt}{\log^2 t}\right) - M\int_{x}^{\infty} \frac{dt}{\log^2 t}$.

Now there are two problems I am unable to resolve:

1. How to prove that $\int_{2}^{\infty} \frac{dt}{\log^2 t}$ is convergent?
2. How to evaluate the integral $\int_{x}^{\infty} \frac{dt}{\log^2 t}$?
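Before trying to prove convergence, it can be worth checking the claim numerically. The sketch below (my own code, not from the book; the helper name `integral_log2` is made up) estimates $I(x) = \int_2^x \frac{dt}{\log^2 t}$ with a crude midpoint rule. If the improper integral converged, $I(x)$ would stay bounded as $x$ grows; the computed values keep growing instead, which is worth keeping in mind when attacking problem 1.

```python
import math

def integral_log2(x, steps=200_000):
    """Crude midpoint-rule estimate of I(x) = integral from 2 to x of dt / log(t)^2."""
    h = (x - 2) / steps
    return sum(h / math.log(2 + (i + 0.5) * h) ** 2 for i in range(steps))

# If the improper integral converged, these values would level off.
for x in (10**3, 10**5, 10**7):
    print(x, integral_log2(x))
```

The midpoint rule is rough near $t = 2$, but the qualitative behaviour for large $x$ does not depend on that.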

Can someone please explain?

Accepted answer:

You don't need anything more than $$\pi(x)= \pi(x)-\pi(x^{1/2})+O(x^{1/2})$$ together with (since every prime $p \in (x^{1/2}, x]$ satisfies $\log x^{1/2} < \log p \le \log x$) $$\frac{\theta(x)-\theta(x^{1/2})}{\log x}\le \pi(x)-\pi(x^{1/2})\le \frac{\theta(x)}{\log x^{1/2}}$$ to get $$\frac{\theta(x)}{x}\in [A,B]\implies \frac{\pi(x)}{x/\log x}\in [a,b]$$ for suitable constants $0 < a < b$.
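The sandwich inequality above can be spot-checked numerically. This is a sketch with made-up helper names (`primes_up_to`, `pi`, `theta`), not anything from the book:

```python
import math

def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p*p::p] = [False] * len(sieve[p*p::p])
    return [p for p, is_p in enumerate(sieve) if is_p]

def pi(x, primes):
    """pi(x): number of primes <= x."""
    return sum(1 for p in primes if p <= x)

def theta(x, primes):
    """theta(x): sum of log p over primes p <= x."""
    return sum(math.log(p) for p in primes if p <= x)

primes = primes_up_to(10**6)
for x in (10**4, 10**5, 10**6):
    c = math.isqrt(x)  # integer cutoff for the sqrt(x) terms
    lower = (theta(x, primes) - theta(c, primes)) / math.log(x)
    middle = pi(x, primes) - pi(c, primes)
    upper = theta(x, primes) / math.log(math.sqrt(x))
    print(x, lower <= middle <= upper)  # prints True for each x
```

Each prime counted in the middle term contributes between $\log x^{1/2}$ and $\log x$ to the theta difference, which is exactly why the sandwich holds.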

Once we know that $\theta(x)=O(x)$, partial summation gives $$\pi(x)=\sum_{2\le n\le x}\frac{\theta(n)-\theta(n-1)}{\log n}=\frac{\theta(x)}{\log x}+\sum_{2\le n\le x-1} \theta(n)\left(\frac1{\log n}-\frac1{\log (n+1)}\right)$$ $$ = \frac{\theta(x)}{\log x}+\sum_{2\le n\le x} O\left(n\,\frac{\log(n+1)-\log n}{\log^2 n}\right)=\frac{\theta(x)}{\log x}+\sum_{2\le n\le x} O\left(n\,\frac{1/n}{\log^2 n}\right)$$ $$=\frac{\theta(x)}{\log x}+O\left(\frac{x}{\log^2 x}\right)$$