How to show that the Riemann hypothesis, random walks and the Möbius function are related or even equivalent?
I was reading the paper Randomness and Pseudorandomness by Avi Wigderson, but to me the most interesting thing wasn't discussed there.
The fact that the Riemann Hypothesis and the Mobius function are related has its origin in the formula $$\frac{1}{\zeta(s)} = \sum_{n=1}^\infty \mu(n)n^{-s}.$$
The relation to random walks is that the Riemann Hypothesis implies that the cumulative sum of the Mobius function $$ M(x) =\sum_{n\le x}\mu(n)$$ has random-walk-like asymptotics $M(x) = O(x^{1/2+\epsilon}).$ The naive guess is that if the Mobius function behaved like a random walk, the order of growth would be $\sqrt{2x\log\log x}$ (by the law of the iterated logarithm), so the Riemann Hypothesis gets us fairly close to this.
The identity relating the zeta function and the Mobius function is straightforward and can be understood with little work by a person without much background. It has to do with Euler's identity $$ \zeta(s) = \prod_p\frac{1}{1-p^{-s}}.$$ You can find derivations of this identity in a lot of places. Basically you expand each factor in a geometric series and multiply out; the equivalence with the sum form $ \zeta(s) = \sum_{n=1}^\infty n^{-s}$ then follows from the fundamental theorem of arithmetic. Then if you flip Euler's identity over and multiply it out, thinking about which terms are present and whether they carry plus or minus signs will lead you to the identity $$ \frac{1}{\zeta(s)} = \prod_p (1-p^{-s}) = \sum_{n=1}^\infty \mu(n) n^{-s}.$$ A good for-the-layman treatment of Euler's identity, as well as more advanced material and connections to randomness, can be found in Prime Obsession by (the regrettably somewhat disgraced) Derbyshire.
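If you want to see these identities in action numerically, here is a quick Python sketch (the sieve for $\mu$ and the cutoff $N=10^5$ are just illustrative choices): at $s=2$ the Euler product should approach $\zeta(2)=\pi^2/6$, and the Mobius sum should approach $1/\zeta(2)=6/\pi^2$.

```python
from math import pi

def mobius_sieve(n):
    """Sieve mu(1..n) and the primes up to n: flip the sign of mu(m) once
    for each prime factor of m, and zero it out when m is not squarefree."""
    mu = [0, 1] + [1] * (n - 1)          # mu[0] is unused
    is_prime = [True] * (n + 1)
    for p in range(2, n + 1):
        if is_prime[p]:
            for m in range(2 * p, n + 1, p):
                is_prime[m] = False
            for m in range(p, n + 1, p):
                mu[m] *= -1
            for m in range(p * p, n + 1, p * p):
                mu[m] = 0                # p^2 | m  =>  mu(m) = 0
    primes = [p for p in range(2, n + 1) if is_prime[p]]
    return mu, primes

N = 10**5
mu, primes = mobius_sieve(N)

# Euler product:  prod_p 1/(1 - p^{-2})  ->  zeta(2) = pi^2/6
euler = 1.0
for p in primes:
    euler /= 1 - p**-2

# Mobius sum:  sum_n mu(n)/n^2  ->  1/zeta(2) = 6/pi^2
mob = sum(mu[n] / n**2 for n in range(1, N + 1))

print(euler, pi**2 / 6)   # both about 1.6449
print(mob, 6 / pi**2)     # both about 0.6079
```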
But the latter fact relating the Mobius function to a random walk requires some more heavy lifting.
In brief, the $1/2$ power and the fact that the nontrivial zeros have real part $1/2$ are intimately related. The origin is that cumulative functions $F(x) = \sum_{n\le x}f(n)$ are related to Dirichlet series $\tilde f (s)=\sum f(n)n^{-s}$ by a Mellin transform $$ \frac{\tilde f(s)}{s} = \int_1^\infty x^{-(s+1)}F(x)\,dx,$$ which can then be inverted to $$ F(x) = \frac{1}{2\pi i}\int_{a-i\infty}^{a+i\infty}\frac{\tilde f(s)}{s}x^s\,ds.$$
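As a sanity check of the forward Mellin formula, one can take $f=\mu$, $F=M$, and $s=2$ and compute both sides numerically. Since $M(x)$ is constant on each interval $[n,n+1)$, the integral splits into elementary pieces. A Python sketch (the sieve and the cutoff $N=10^5$ are illustrative choices):

```python
from math import pi

def mobius_sieve(n):
    """Sieve of mu(1..n): flip sign per prime factor, zero out non-squarefree."""
    mu = [0, 1] + [1] * (n - 1)
    is_prime = [True] * (n + 1)
    for p in range(2, n + 1):
        if is_prime[p]:
            for m in range(2 * p, n + 1, p):
                is_prime[m] = False
            for m in range(p, n + 1, p):
                mu[m] *= -1
            for m in range(p * p, n + 1, p * p):
                mu[m] = 0
    return mu

N = 10**5
mu = mobius_sieve(N)

s = 2
# M(x) = sum_{n<=x} mu(n) is constant on [n, n+1), so the Mellin integral
# int_1^inf x^{-(s+1)} M(x) dx splits into exactly computable pieces:
# int_n^{n+1} x^{-(s+1)} dx = (n^{-s} - (n+1)^{-s}) / s.
M, integral = 0, 0.0
for n in range(1, N):
    M += mu[n]
    integral += M * (n**-s - (n + 1)**-s) / s

target = (6 / pi**2) / s          # tilde f(s)/s = (1/zeta(2))/2 = 3/pi^2
print(integral, target)           # both about 0.3040
```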
If you use Cauchy's theorem to evaluate the inversion integral, you will see that the poles of $\tilde f(s)$ with the largest real part contribute the leading-order terms in the asymptotic series for $F(x)$. Since here $\tilde f(s) = \frac{1}{\zeta(s)}$, the poles of the integrand are the zeros of the zeta function; assuming the Riemann Hypothesis they have real part $1/2$, so they contribute a leading term $\sim x^{1/2}$ with oscillations coming from the imaginary parts.
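To make the contour argument a bit more concrete: at a simple zero $\rho$ of $\zeta$, the integrand $\frac{x^s}{s\,\zeta(s)}$ has residue $\frac{x^{\rho}}{\rho\,\zeta'(\rho)}$. So, assuming the Riemann Hypothesis and (for simplicity) that all the zeros are simple, shifting the contour formally produces $$ M(x) = \sum_{\rho}\frac{x^{\rho}}{\rho\,\zeta'(\rho)} + (\text{lower-order terms}), $$ where $\rho$ runs over the nontrivial zeros. Each $\rho = \frac{1}{2}+i\gamma$ contributes a term of size $x^{1/2}$ oscillating like $e^{i\gamma\log x}$, which is exactly where the $x^{1/2}$ in the asymptotics comes from.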
This of course sweeps a lot of details under the rug, but gives a basic outline of the relationship.
To see demonstrations of these facts one can consult an introductory treatment of analytic number theory. (Notes by Hildebrand are available free online and are quite good.)
These three concepts are not equivalent as such, but there is an interesting connection.
We can think of a simple symmetric random walk on $\mathbb{Z}$ as a sequence of random variables $Z_{1},Z_{2},\ldots$ where each $Z_{k}$ takes the value $+1$ or $-1$, each with probability $1/2$. We can then define $S_{n}=\sum_{k=1}^{n}Z_{k}$ to be the position after $n$ steps of the random walk. One can then ask what can be said about $E(|S_{n}|)$, the expected distance of the walk from the origin after $n$ steps. As is noted in the article, $E(|S_{n}|)$ is "about $\sqrt{n}$". More precisely, $$\lim_{n\rightarrow\infty}\frac{E(|S_{n}|)}{\sqrt{n}}=\sqrt{\frac{2}{\pi}}.$$
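One can check this limit empirically with a quick simulation. A Python sketch (the walk length, trial count, and seed are arbitrary choices):

```python
import random
from math import sqrt, pi

random.seed(0)
n, trials = 500, 4000

# Estimate E(|S_n|) by averaging |S_n| over many independent walks.
total = 0
for _ in range(trials):
    position = sum(random.choice((-1, 1)) for _ in range(n))
    total += abs(position)

estimate = total / trials / sqrt(n)
print(estimate, sqrt(2 / pi))   # both close to sqrt(2/pi) = 0.7979...
```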
In particular, we can see that $E(|S_{n}|)=O(n^{\epsilon+\frac{1}{2}})$ for any $\epsilon>0$.
How does this relate to the Riemann Hypothesis? Well, there is a formulation of the Riemann Hypothesis which bears a striking resemblance to the above statement. The idea is outlined in the article in the following way. Instead of using random variables $Z_{k}$ to determine each step, use $\mu(k)$. Here $\mu$ is the Mobius function, which is defined by setting $\mu(k)=+1$ if $k$ is a product of an even number of distinct primes, $\mu(k)=-1$ if $k$ is a product of an odd number of distinct primes, and $\mu(k)=0$ otherwise (that is, when $k$ is divisible by the square of a prime). Again we can think of using $\mu$ to tell us how to walk along $\mathbb{Z}$.
Now define the Mertens function $M(n)=\sum_{k=1}^{n}\mu(k)$. This function tells us where we are after $n$ steps of this walk. Note that, unlike the random walk, this walk is deterministic, so we are looking at $M(n)$ directly and not some expectation. So, as with the random walk, what can we say about $M(n)$? Well, as is hinted at in the article, it turns out that the Riemann Hypothesis is equivalent to the conjecture that $$M(n)=O(n^{\epsilon+\frac{1}{2}})$$ holds for every $\epsilon>0$. Compare this with what was said about the random walk.
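One can watch this deterministic walk in action. The following Python sketch (the sieve and the cutoff $N=10^5$ are illustrative choices) tracks the ratio $|M(n)|/\sqrt{n}$, which in this range never exceeds its value $1$ at $n=1$; this is consistent with, though of course very far from proving, the conjectured bound.

```python
from math import sqrt

def mobius_sieve(n):
    """Sieve of mu(1..n): flip sign per prime factor, zero out non-squarefree."""
    mu = [0, 1] + [1] * (n - 1)
    is_prime = [True] * (n + 1)
    for p in range(2, n + 1):
        if is_prime[p]:
            for m in range(2 * p, n + 1, p):
                is_prime[m] = False
            for m in range(p, n + 1, p):
                mu[m] *= -1
            for m in range(p * p, n + 1, p * p):
                mu[m] = 0
    return mu

N = 10**5
mu = mobius_sieve(N)

M, worst = 0, 0.0
for n in range(1, N + 1):
    M += mu[n]                       # M is now the Mertens function M(n)
    worst = max(worst, abs(M) / sqrt(n))

# The ratio |M(n)|/sqrt(n) equals 1 at n = 1 and stays well below 1
# for the rest of this range.
print(worst)
```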