On a step in a proof of the Levinson density theorem


Let $n: \mathbb{R}\rightarrow\mathbb{N}$ be a nondecreasing function such that $\int_{1}^{r} \frac{n(u)}{u}\,du \leq \frac{1}{2}\log r + A$ for all $r \geq 1$, where $A$ is a fixed constant. I need to deduce that $n(r)=0$ for all $r \geq 1$; any suggestions?

Best answer:

Suppose $n(a) \neq 0$ for some $a \geqslant 1$. Since $n(a)$ is a nonnegative integer, this means $n(a) \geqslant 1$. Because $n$ is nondecreasing, we have $n(u) \geqslant n(a) \geqslant 1$ for every $u > a$. Hence, using $n \geqslant 0$ on $[1,a]$, $$ \int_1^r \frac{n(u)}{u} \, du \geqslant \int_a^r \frac{n(u)}{u} \, du \geqslant \int_a^r \frac{n(a)}{u} \, du \geqslant \int_a^r \frac{1}{u} \, du = \log r - \log a, $$ and for $r$ sufficiently large this exceeds $\frac{1}{2}\log r + A$, contradicting the hypothesis. Hence $n(a) = 0$ for every $a \geqslant 1$.
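
One way to make the "for $r$ sufficiently large" step concrete is to solve the inequality directly; for instance, $$ \log r - \log a > \tfrac{1}{2}\log r + A \iff \tfrac{1}{2}\log r > A + \log a \iff r > a^{2}e^{2A}, $$ so any choice of $r$ larger than $a^{2}e^{2A}$ already produces the contradiction.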