RH as asymptotic order of Liouville’s partial sum function


E. Landau famously proved in his 17-page PhD thesis from 1899 that $$ \lim_{n\to\infty}\frac{\sum_{k=1}^{n} \lambda(k)}{n^{\frac{1}{2} +\varepsilon}} =0 \quad \forall \varepsilon>0,$$

where $\lambda$ is the Liouville lambda function, is equivalent to the Riemann Hypothesis (RH).

Defining $L(n):= \sum_{k=1}^{n} \lambda(k)$ (Liouville’s partial sum sequence) and using (Landau’s) big-O notation, we might then write the RH as $$ L(n)=\mathcal{O}(n^{\frac{1}{2}+\varepsilon}) \quad \forall \varepsilon>0. \tag{1}$$
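For concreteness, $L(n)$ can be computed directly from the definition $\lambda(k) = (-1)^{\Omega(k)}$, where $\Omega(k)$ counts prime factors with multiplicity. A minimal sketch (function names are my own):

```python
def liouville(k):
    """Liouville lambda: (-1)**Omega(k), where Omega(k) counts
    prime factors of k with multiplicity."""
    count = 0
    d = 2
    while d * d <= k:
        while k % d == 0:
            k //= d
            count += 1
        d += 1
    if k > 1:  # leftover prime factor
        count += 1
    return -1 if count % 2 else 1

def L(n):
    """Liouville's partial sum L(n) = sum_{k=1}^{n} lambda(k)."""
    return sum(liouville(k) for k in range(1, n + 1))
```

Plotting $L(n)/\sqrt{n}$ for growing $n$ then gives a feel for the ratio that (1) and (2) disagree about.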

As is often stated, this is not equivalent to $$L(n)=\mathcal{O}(n^{\frac{1}{2}}). \tag{2}$$

This is where I am stuck: I fail to see the difference between (1) and (2). I would be very grateful if someone could outline/explain this difference. What function would fit in between the two without being either of them?



Best answer:

When it's written as $L(n)=O(n^{1/2+\epsilon})$, keep in mind that $\epsilon$ can never be exactly zero. You have to think about what happens as $n$ gets large: no matter which $\epsilon$ you pick, as long as it's positive, $n^{\epsilon}$ still diverges as $n$ tends to infinity. This matters when comparing $L(n)$ to $n^{1/2}$, since it determines whether the ratio $\frac{L(n)}{n^{1/2}}$ stays bounded or diverges.

The reason a limit argument doesn't close the gap is the implied constants. The statement $\forall\epsilon > 0$ means that for each $\epsilon$ there is a constant $c_{\epsilon}$, depending on $\epsilon$, such that $|L(n)| < c_{\epsilon}n^{1/2+\epsilon}$ for all $n\in\mathbb{N}$. If you fix any $\epsilon$ greater than zero, then $c_{\epsilon}$ is a genuine constant, since $\epsilon$ is a fixed number. However, if you let $\epsilon$ tend to zero, it is no longer fixed but a variable, so $c_{\epsilon}$ is no longer a constant and could potentially approach infinity as $\epsilon$ tends to zero.
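This blow-up of the constant can be seen numerically. Take a stand-in function that satisfies (1) but not (2), say $f(n)=\sqrt{n}\,\ln(n)$ (my choice for illustration), and compute the smallest admissible constant over a finite range; the ratio $f(n)/n^{1/2+\epsilon}$ simplifies to $\ln(n)/n^{\epsilon}$:

```python
import math

def best_constant(eps, N=10**5):
    # Smallest c with sqrt(n)*ln(n) <= c * n**(0.5 + eps) for 2 <= n <= N.
    # The ratio simplifies to ln(n) / n**eps, which peaks near n = e**(1/eps)
    # with value 1/(e*eps) -- so the needed constant grows without bound
    # as eps shrinks toward 0.
    return max(math.log(n) / n**eps for n in range(2, N + 1))
```

Running `best_constant` for $\epsilon = 0.5, 0.3, 0.1$ shows the required constant climbing as $\epsilon$ shrinks, which is exactly why "for all $\epsilon>0$" does not collapse to the $\epsilon=0$ statement.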

Another answer:

An example (although this is not Liouville’s actual partial sum sequence, so I'm calling it $L_1(n)$ instead) which fits in between the two without being either is

$$L_{1}(n) = n^{\frac{1}{2}}\ln(n) \tag{1}\label{eq1A}$$

Note that, technically, for all $\varepsilon \gt 0$, we actually have $L_1(n) \in \mathcal{O}(n^{\frac{1}{2}+\varepsilon})$; indeed, we even have $L_1(n) \in o(n^{\frac{1}{2}+\varepsilon})$ (using the little-o notation), as indicated in \eqref{eq2A} below, i.e., this function also satisfies your initial limit equation of

$$\lim_{n\to\infty}\frac{L_{1}(n)}{n^{\frac{1}{2} +\varepsilon}} = 0 \quad \forall \; \varepsilon \gt 0 \tag{2}\label{eq2A}$$

As indicated in your comment, what you're looking for is $n^{\frac{1}{2}}$ multiplied by a function which is unbounded, but which grows slower than any fixed positive power of $n$, with $\ln(n)$ (e.g., as used in \eqref{eq1A}) being one such function.

In particular, for any fixed $\epsilon \gt 0$ and $C_1 \gt 0$, there is always an $n_1$ such that $\ln(n) \lt C_{1}n^{\epsilon}$ for all $n \ge n_1$, while there are no constants $C_2$ and $n_2$ such that $\ln(n) \lt C_2$ for all $n \ge n_2$.
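The first claim can be checked numerically (function name is my own): since the ratio $\ln(n)/n^{\epsilon}$ peaks at $n = e^{1/\epsilon}$ and decreases afterwards, one can search from the peak onward for a valid threshold $n_1$:

```python
import math

def first_n_below(eps, C1):
    # ln(n)/n**eps is maximized at n = e**(1/eps) (value 1/(e*eps)) and
    # decreases beyond it, so the first n at or past the peak with
    # ln(n) < C1 * n**eps works as a threshold n_1: the inequality
    # then holds for all n >= n_1.
    n = max(2, math.ceil(math.e ** (1 / eps)))
    while math.log(n) >= C1 * n**eps:
        n += 1
    return n
```

For instance, `first_n_below(0.5, 0.5)` finds the point past which $\ln(n)$ stays below $\frac{1}{2}\sqrt{n}$; no analogous threshold exists for a constant bound, since $\ln(n)$ is unbounded.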