$N^{1/2}$ and randomness


I apologize if this question is overly vague, but part of the reason I am asking is because I don't know a more precise way of discussing these ideas.

To state a general question:

What, if any, connection is there between $O(N^{1/2})$ (or related ideas if big $O$ is not appropriate) and randomness?

Here are a few things which lead me to this vague (sorry!) question:

I was recently talking with an analytic number theorist and he stated something along the lines of

"Such and such a sequence $\{a_n\}$ has partial sums $\displaystyle\sum_{n<N}a_n$ has asymptotic behavior of $O(N^{1/2})$, so the sequence is essentially random in its distribution."

I would like to understand more what he is referring to.


EDIT: I understand that $O(\sqrt N)$ behavior can occur for very deterministic sequences. That was never in question, so I apologize if my quote was misleading. However, I would like to understand the reasoning that would lead him to make such a comment (admittedly imprecise, and incorrect in general).


In other contexts, I have seen $N^{1/2}$ related to random walks and the term "root-mean-squared", but these are unknown waters for me.

There is also the appearance of $O(x^{1/2}\log(x))$ in the Riemann hypothesis as the best error bound for the prime number theorem.

Is this somehow related to the "randomness" that $N^{1/2}$ suggests in other contexts, with the $\log(x)$ term accounting for the non-random fact that primes become sparser for $x \gg 0$?

Any information about these specific things would be greatly appreciated.


There are 3 answers below.

On BEST ANSWER

Suppose $X_1, X_2, \dots X_n$ is a sequence of independent and identically distributed random variables with mean $0$ and standard deviation $\sigma$. Then the sum $S_n = X_1 + \dots + X_n$ has mean $0$ and standard deviation $\sqrt{n} \sigma$.

There are a large class of results in probability, the simplest being Chebyshev's inequality and the most famous being the central limit theorem, guaranteeing that random variables are not too far from their expected values, where "too far" is measured in terms of the standard deviation. So what the above tells you is that while you might have expected the sum above to vary as much as $O(n)$ from its mean (which occurs if the $X_i$ are not independent but are instead identical, say), in fact independence implies that it only varies about $O(\sqrt{n})$ from its mean. This is a fundamental observation in probability and statistics.
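This $\sqrt{n}\,\sigma$ scaling is easy to check by simulation. Here is a minimal sketch (my illustration, not part of the answer), taking the $X_i$ uniform on $\{-1, +1\}$ so that $\sigma = 1$ and the predicted standard deviation of $S_n$ is exactly $\sqrt{n}$:

```python
import random
import statistics

def sum_sd(n, trials=2000, seed=0):
    """Empirical standard deviation of S_n = X_1 + ... + X_n,
    where the X_i are i.i.d. uniform on {-1, +1} (mean 0, sigma = 1)."""
    rng = random.Random(seed)
    sums = [sum(rng.choice((-1, 1)) for _ in range(n)) for _ in range(trials)]
    return statistics.pstdev(sums)

# Theory predicts sd(S_n) = sqrt(n) * sigma = sqrt(n).
for n in (100, 400, 1600):
    print(n, round(sum_sd(n), 1))  # roughly 10, 20, 40
```

Quadrupling $n$ should roughly double the empirical standard deviation, rather than quadruple it as the naive $O(n)$ guess would suggest.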

On

If you take the average of $N$ independent random variables, you get something with standard deviation of size roughly $N^{-1/2}$. This is the scaling behind the central limit theorem. So the sum has error $O(N^{1/2})$.

The law of the iterated logarithm might also be relevant, as it describes the rate at which such averages converge to the mean.

On

I'm not sure how that remark about an $O(N^{1/2})$ sequence being random came about, because you can easily come up with a deterministic (non-random) sequence such that the sum of the first $N$ terms is exactly $N^{1/2}$.
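For a concrete instance of such a deterministic sequence (my example, not necessarily the answerer's), take the telescoping sequence $a_n = \sqrt{n} - \sqrt{n-1}$, whose partial sums are exactly $\sqrt{N}$:

```python
import math

# a_n = sqrt(n) - sqrt(n - 1) is completely deterministic, yet its partial
# sums telescope: sum_{n <= N} a_n = sqrt(N) exactly.
def a(n):
    return math.sqrt(n) - math.sqrt(n - 1)

N = 10_000
partial_sum = sum(a(n) for n in range(1, N + 1))
print(partial_sum, math.sqrt(N))  # both 100.0, up to float rounding
```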

However, the square root does have some notable importance in randomness. If you have $n$ i.i.d. samples from a distribution with standard deviation $\sigma$, then the standard deviation of the sample mean (used to estimate the mean of the distribution) is $\sigma / \sqrt{n}$. For related reasons, if you take a random walk of $N$ steps, each $+1$ or $-1$, on the real line, then the standard deviation of your final position relative to your initial position is $\Theta(\sqrt{N})$.