I was reading L'Ecuyer's article "Uniform Random Number Generation". One of the PRNGs proposed there is the multiple recursive random number generator with an added constant.

My question: why is the distribution of a multiple recursive random number generator with an added constant uniform? I couldn't find a proper reference on the subject.
These generators are defined by the recurrence $$ x_{n}\equiv a_1x_{n-1}+a_2x_{n-2}+\ldots+a_kx_{n-k}+a\pmod{m}.$$
I'm especially interested in the case $m=2^r$, for which implementation on digital hardware has lower complexity.
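For concreteness, the recurrence above can be sketched in a few lines of Python. The coefficients, constant, and seed below are arbitrary illustrative values, not recommended parameters; choosing good ones is exactly the hard part discussed in the literature.

```python
# Minimal sketch of a multiple recursive generator (MRG) with an added
# constant:  x_n = (a_1 x_{n-1} + ... + a_k x_{n-k} + a) mod m,  with m = 2^r.
# All parameter values here are illustrative only.

def mrg_with_constant(coeffs, a, seed, m):
    """Yield successive values of the recurrence modulo m.

    coeffs : [a_1, ..., a_k]
    seed   : [x_{n-1}, ..., x_{n-k}]  (most recent state first)
    """
    state = list(seed)
    while True:
        x = (sum(c * s for c, s in zip(coeffs, state)) + a) % m
        yield x
        state = [x] + state[:-1]   # shift the window of past values

gen = mrg_with_constant(coeffs=[5, 3], a=7, seed=[1, 2], m=2**16)
out = [next(gen) for _ in range(5)]
print(out)  # [18, 100, 561, 3112, 17250]
```

With $m = 2^r$ the reduction modulo $m$ is just a bitmask (`x & (m - 1)`), which is why that choice is attractive in hardware.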
Of course, no deterministic algorithm can generate truly random numbers. The goal is to find an algorithm whose output values 'behave as if' they are independent and identically distributed according to some distribution, usually $\mathrm{Unif}(0, 1)$.
By 'behave as if' one means that the pseudo-random numbers pass a large battery of benchmark tests for randomness, and give correct answers (within random error) to a large collection of problems known to be difficult to simulate. If a PRNG fails at one of these tasks, we know it's a 'bad' one. 'Good' PRNGs are ones that have not been proved bad after extensive testing. (Beyond the obvious, special attention is given to checking that there is no multi-dimensional serial correlation, even in high dimensions.)
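One of the simplest entries in any such test battery is a chi-square test of uniformity on the marginal distribution. The following is a minimal sketch, not part of any standard battery: values in $[0,1)$ are binned into equal-width cells, and under uniformity the statistic is approximately chi-square with (bins $-$ 1) degrees of freedom.

```python
# Hedged sketch: chi-square goodness-of-fit test for Unif(0,1) output.
import random

def chi_square_uniformity(samples, bins=10):
    counts = [0] * bins
    for u in samples:                          # each u assumed in [0, 1)
        counts[min(int(u * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

random.seed(0)                                 # fixed seed for reproducibility
stat = chi_square_uniformity([random.random() for _ in range(10000)])
# For bins=10 (9 degrees of freedom) the 95th percentile is about 16.9;
# a much larger statistic would flag non-uniformity.
print(stat)
```

Passing this single test says very little; real batteries (e.g. TestU01, also due to L'Ecuyer) run hundreds of such tests, including high-dimensional serial-correlation checks.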
Some number-theoretic and other rules are known that identify kinds of algorithms and specific choices of constants (e.g., your $a_i$ and $m$) one must always avoid. Other rules are known for constructing 'potentially' promising PRNGs, but one never knows whether a particular PRNG is useful until it has been thoroughly vetted.
If you have access to R statistical software, a list of PRNGs currently regarded as 'good' is available by typing `?.Random.seed` at the prompt `>` in the Session window. The default in R is the 'Mersenne-Twister', which has a 'good' record after very extensive use.

There is controversy over whether digits far out in the decimal expansion of transcendental numbers such as $e$ and $\pi$ are random, but the controversy is of little practical importance because it takes too much computing to produce vast numbers of digits rapidly enough for modern simulations.
The best efforts towards truly random numbers have used physical random noise of various sorts. But these methods are currently too slow for extensive simulations. Perhaps somewhere along the line toward the development of quantum computers a way to generate truly random values at a very rapid rate will be discovered.