Proving several error terms for the divisor function $d(n)$


Given the divisor function $d(n) = \#\{d : d \mid n\}$, I am trying to show the following:

  1. $d(n) = O(\sqrt{n})$
  2. $d(n) = O\Big(\exp\Big(\frac{c \log n}{\log \log n}\Big)\Big)$ for some constant $c > 0$
  3. $d(n) = O(n^{\epsilon})$ for any $\epsilon > 0$

I believe I have shown (1). I don't know where to start with (2). On (3) I think I have made some progress.

My attempts are below:

  1. $d(n) = \sum_{d \mid n}1 = \sum_{ab = n}1 \leq 2\sum_{\substack{a \mid n \\ a \leq \sqrt{n}}}1 \leq 2 \sqrt{n}$, since in each pair $ab = n$ at least one of the factors is at most $\sqrt{n}$. Can I then just say this is $O(\sqrt{n})$?
  2. Let $\epsilon > 0$. In part (3) I show that each prime greater than $\exp(\frac{1}{\epsilon})$ contributes at most $1$ to the product, so we only need to consider small primes. Since $p_{j}^{\epsilon \alpha_{j}} = \exp(\epsilon \alpha_{j} \log p_{j})$, the Taylor expansion of the exponential gives $p_{j}^{\epsilon \alpha_{j}} \geq 1 + \epsilon \alpha_{j} \log p_{j}$, hence $\frac{\alpha_{j}+1}{p_{j}^{\epsilon \alpha_{j}}} \leq \frac{\alpha_{j}+1}{1 + \epsilon \alpha_{j} \log p_{j}}$. I am not sure what I can bound this quantity by; if I find a suitable bound, I am thinking I can put it into the product defined in part (3).
  3. Consider $\frac{d(n)}{n^{\epsilon}}$ and write $n = p_{1}^{\alpha_{1}}p_{2}^{\alpha_{2}}\dots p_{k}^{\alpha_{k}}$, where the $\alpha_{j} > 0$ and the $p_{j}$ are distinct primes; then $n^{\epsilon}$ can likewise be written as a product of prime powers. By the fundamental theorem of arithmetic, the divisor function satisfies $d(n) = \prod_{j=1}^{k}(\alpha_{j} + 1)$.

Therefore $\frac{d(n)}{n^{\epsilon}} = \prod_{j=1}^{k}\frac{\alpha_{j}+1}{p_{j}^{\epsilon \alpha_{j}}}$.

We can now fix a prime $p_{j}$ and consider a single term $\frac{\alpha_{j}+1}{p_{j}^{\epsilon \alpha_{j}}}$. For large $\alpha_{j}$ the denominator dominates; for small $\alpha_{j}$, because of the $\epsilon$ in the exponent, the numerator can dominate when $p_{j}$ is sufficiently small.

Splitting into cases I get:

  • Suppose $p_{j} \geq \exp(\frac{1}{\epsilon})$. Then $p_{j}^{\epsilon \alpha_{j}} \geq \exp(\alpha_{j}) \geq 1 + \alpha_{j}$ by the Taylor expansion of the exponential function. Therefore every large prime contributes at most $1$ to the product.

  • For $p_{j} < \exp(\frac{1}{\epsilon})$, we have $\frac{\alpha_{j}+1}{p_{j}^{\epsilon \alpha_{j}}} \leq C_{p_{j}, \epsilon}$, where the constant does not depend on $\alpha_{j}$ since $\frac{\alpha + 1}{p_{j}^{\epsilon \alpha}} \rightarrow 0$ as $\alpha \to \infty$. Therefore each small prime gives a bounded contribution to the product.

Can I then say that the number of small primes is bounded (since there are only finitely many primes below $\exp(\frac{1}{\epsilon})$), and therefore the whole product is bounded?

Hence the result follows.
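As a numerical sanity check of (1) and of (3) with $\epsilon = \frac{1}{2}$ (not a proof, just an illustration; the helper `divisor_count` is my own, written only for this check):

```python
from math import isqrt

def divisor_count(n):
    """d(n): count divisors by pairing each a | n with n // a."""
    count = 0
    for a in range(1, isqrt(n) + 1):
        if n % a == 0:
            count += 1 if a * a == n else 2  # a == n // a is counted once
    return count

# (1): d(n) <= 2*sqrt(n), since in each pair ab = n one factor is <= sqrt(n)
assert all(divisor_count(n) <= 2 * n**0.5 for n in range(1, 20_000))

# (3) with eps = 1/2: d(n)/n^(1/2) stays bounded; over this range the
# maximum is sqrt(3) ~ 1.732, attained at n = 12
best = max(range(1, 20_000), key=lambda n: divisor_count(n) / n**0.5)
print(best, divisor_count(best) / best**0.5)
```

Of course the check only inspects finitely many $n$; the point of the proof is that the bound holds uniformly.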

Thanks.


On BEST ANSWER

For the first question, the answer is yes. For the third question (i.e. the final question), yes, it works: you get $\frac{d(n)}{n^{\varepsilon}}\leqslant C_{\varepsilon}$, where $\displaystyle C_{\varepsilon}=\prod_{p<e^{1/\varepsilon}}C_{p,\varepsilon}$ does not depend on $n$.

As for $2.$, I'll use $3.$. First notice that the supremum $C_{\varepsilon}:=\sup\limits_{n\geqslant 1}\frac{d(n)}{n^{\varepsilon}}$ is attained at $\displaystyle n_{\varepsilon}:=\prod_p p^{\alpha_{p,\varepsilon}}$, where $\displaystyle\alpha_{p,\varepsilon}:=\left\lfloor \frac{1}{p^{\varepsilon}-1} \right\rfloor$. Indeed, if $f(\alpha):=\frac{\alpha+1}{p^{\alpha\varepsilon}}$, then $$ f(\alpha+1)\geqslant f(\alpha)\iff\frac{\alpha+2}{\alpha+1}\geqslant p^{\varepsilon}\iff\alpha+1\leqslant\frac{1}{p^{\varepsilon}-1}\iff\alpha\leqslant\alpha_{p,\varepsilon}-1, $$ so $f$ increases up to $\alpha=\alpha_{p,\varepsilon}$ and decreases afterwards; it reaches its maximum at $\alpha=\alpha_{p,\varepsilon}$.

Now let $x_k:=\left(1+\frac{1}{k}\right)^{1/\varepsilon}$. Then, for $k\geqslant 1$, $$ \alpha_{p,\varepsilon}=k\iff \frac{1}{p^{\varepsilon}-1}-1<k\leqslant\frac{1}{p^{\varepsilon}-1}\iff x_{k+1}<p\leqslant x_k. $$ Let $k_0:=\alpha_{2,\varepsilon}$; there is no prime $p\leqslant x_{k_0+1}$, because $x_{k_0+1}<2$, so $$ n_{\varepsilon}=\prod_{k\leqslant k_0}\left(\prod_{x_{k+1}<p\leqslant x_k}p\right)^k $$

From this expression we can deduce the following two estimates: $$ \ln n_{\varepsilon}=\vartheta(x_1)+\mathcal{O}\left(x_1^{3/4}\right) \ \ \text{ and }\ \ \ln d(n_{\varepsilon})=(\ln 2)\pi(x_1)+\mathcal{O}\left(x_1^{3/4}\right) $$ Indeed, by partial summation, $\displaystyle\ln n_{\varepsilon}=\sum_{k\leqslant k_0}k(\vartheta(x_k)-\vartheta(x_{k+1}))=\sum_{k\leqslant k_0}\vartheta(x_k)$ and, using $x_2\leqslant x_1^{\frac{\ln 3}{\ln 2}-1}$, we have $$ \sum_{2\leqslant k\leqslant k_0}\vartheta(x_k)\ll k_0\vartheta(x_2)\ll k_0 x_2\ln x_2\ll x_2(\ln x_2)^2\ll x_1^{\frac{\ln 3}{\ln 2}-1}(\ln x_1)^2\ll x_1^{3/4} $$ because $\frac{\ln 3}{\ln 2}-1\approx 0.58< 0.75$.
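The claim that the supremum is attained at $n_{\varepsilon}$ can be checked numerically; the sketch below (with hypothetical helper names of my own) builds $n_{\varepsilon}$ from $\alpha_{p,\varepsilon}=\lfloor 1/(p^{\varepsilon}-1)\rfloor$ for $\varepsilon=\frac{1}{2}$ and compares $d(n)/n^{\varepsilon}$ against it over a finite range.

```python
from math import isqrt

def d(n):
    """Number of divisors of n, counting divisor pairs (a, n // a)."""
    return sum(1 if a * a == n else 2
               for a in range(1, isqrt(n) + 1) if n % a == 0)

def n_eps(eps):
    """n_eps = prod_p p^alpha(p) with alpha(p) = floor(1/(p^eps - 1))."""
    n, p = 1, 2
    while p**eps - 1 <= 1:  # alpha(p) >= 1 iff p <= 2^(1/eps)
        if all(p % q for q in range(2, isqrt(p) + 1)):  # p is prime
            n *= p ** int(1 / (p**eps - 1))
        p += 1
    return n

eps = 0.5
ne = n_eps(eps)      # alpha(2) = 2, alpha(3) = 1, rest 0  ->  n_eps = 12
C = d(ne) / ne**eps  # the claimed supremum C_eps = sqrt(3)
assert all(d(n) / n**eps <= C + 1e-12 for n in range(1, 20_000))
print(ne, C)
```

The `while` condition encodes the observation that $\alpha_{p,\varepsilon}=0$ once $p>2^{1/\varepsilon}$, so only finitely many primes enter the product.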
As for the other estimate, partial summation likewise gives $$ \ln d(n_{\varepsilon})=\sum_{k\leqslant k_0}\ln(k+1)\left(\pi(x_k)-\pi(x_{k+1})\right)=\sum_{k\leqslant k_0}\ln\left(1+\frac{1}{k}\right)\pi(x_k) $$ and $$ \sum_{2\leqslant k\leqslant k_0}\ln\left(1+\frac{1}{k}\right)\pi(x_k)\ll\frac{k_0 x_2}{\ln x_2}\ll x_1^{3/4} $$ by the same arguments.

Now let $R(x)$ be a function such that $\pi(x)-{\rm li}(x)\ll R(x)$ and $\vartheta(x)-x\ll R(x)$. Then $$ \ln d(n)\leqslant \ln C_{\varepsilon}+\varepsilon\ln n=\ln d(n_{\varepsilon})-\varepsilon\ln n_{\varepsilon}+\varepsilon\ln n\leqslant(\ln 2)\pi(x_1)-\varepsilon\vartheta(x_1)+\varepsilon\ln n+\mathcal{O}\left(x_1^{3/4}\right)$$ Assuming moreover that $R(x)\gg x^{4/5}$ (so that the $x_1^{3/4}$ error term is absorbed), we obtain $$ \ln d(n)\leqslant (\ln 2){\rm li}(x_1)-\varepsilon x_1+\varepsilon\ln n+\mathcal{O}(R(x_1)) $$ In order to cancel the first-order terms, we choose $\varepsilon:=\frac{\ln 2}{\ln\ln n}$, so that $x_1=2^{1/\varepsilon}=\ln n$. We thus have $\ln d(n)\leqslant(\ln 2){\rm li}(\ln n)+\mathcal{O}(R(\ln n))$ and, using the bound $R(x)\ll\frac{x}{(\ln x)^2}$ (which follows from the prime number theorem), we finally get $$ \ln d(n)\leqslant \frac{(\ln 2)\ln n}{\ln \ln n}+\mathcal{O}\left(\frac{\ln n}{(\ln \ln n)^2}\right)\underset{n\rightarrow +\infty}{\sim}\frac{(\ln 2)\ln n}{\ln \ln n} $$ which gives $2.$ for any constant $c>\ln 2$.
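The two prime-counting estimates can also be tested numerically. This sketch (helper names are my own) takes $\varepsilon=0.1$, so that $x_1=2^{10}=1024$, and compares $\ln n_{\varepsilon}$ with $\vartheta(x_1)$ and $\ln d(n_{\varepsilon})$ with $(\ln 2)\pi(x_1)$; both differences should be $\mathcal{O}(x_1^{3/4})\approx 181$.

```python
from math import isqrt, log

def primes_upto(N):
    """Simple sieve of Eratosthenes returning all primes <= N."""
    sieve = bytearray([1]) * (N + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, isqrt(N) + 1):
        if sieve[i]:
            sieve[i*i::i] = bytearray(len(range(i*i, N + 1, i)))
    return [p for p in range(N + 1) if sieve[p]]

eps = 0.1
x1 = 2 ** (1 / eps)            # x1 = 2^(1/eps) = 1024
ps = primes_upto(int(x1))      # exactly the primes with alpha(p) >= 1

alpha = {p: int(1 / (p**eps - 1)) for p in ps}
ln_n = sum(a * log(p) for p, a in alpha.items())  # ln n_eps
ln_d = sum(log(a + 1) for a in alpha.values())    # ln d(n_eps)
theta = sum(log(p) for p in ps)                   # theta(x1)

# both errors stay well below x1^(3/4) ~ 181
print(abs(ln_n - theta), abs(ln_d - log(2) * len(ps)), x1**0.75)
```

Working with logarithms avoids building the (astronomically large) integer $n_{\varepsilon}$ itself.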