Period of least significant bits of linear congruential random number generator


On a practice exam for a course on stochastic simulations I encountered the following question:

Show that the least significant $n$ bits must repeat with a period $2^n$ for a congruential random generator with a period $2^m$ where $m > n$.

I couldn't find an answer on how to do this anywhere. How can I show this? Is it just that the $n$ bits can only generate $2^n$ numbers?


1 Answer


In Chapter 3 of D. E. Knuth's The Art of Computer Programming, vol. 2 (1st edition), it is shown that the low-order $n$ bits repeat with a period of $2^n$ or less for any choice of the multiplier $a$. If you want to prove that the period is exactly $2^n$, then you may need to place restrictions on $a$. You may also need to place restrictions on the modulus $c$: Knuth notes that if the modulus $c$ is $2^m$ (as in your comment), then the low-order bits are not as random as is possible with other choices of $c$.
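The underlying reason is that reduction mod $2^n$ commutes with the recurrence: if $x_{k+1} = (a x_k + b) \bmod 2^m$, then $x_{k+1} \bmod 2^n = \big(a (x_k \bmod 2^n) + b\big) \bmod 2^n$, so the low-order $n$ bits themselves follow a congruential recurrence mod $2^n$ and can therefore have period at most $2^n$. A small sketch below checks this numerically; the parameters $a = 5$, $b = 3$ are illustrative choices satisfying the Hull–Dobell conditions (increment odd, $a \equiv 1 \pmod 4$) so that the full generator has period $2^m$:

```python
def lcg(a, b, m_bits, seed=0):
    """Yield successive states of x -> (a*x + b) mod 2**m_bits."""
    mod = 1 << m_bits
    x = seed
    while True:
        x = (a * x + b) % mod
        yield x

def low_bits_period(a, b, m_bits, n_bits, seed=0):
    """Period of the low-order n_bits of the LCG output.

    With a odd, x -> (a*x + b) mod 2**n_bits is a bijection, so the
    low-bit sequence is purely periodic and the period equals the
    first return time to the initial low-bit value.
    """
    mask = (1 << n_bits) - 1
    gen = lcg(a, b, m_bits, seed)
    first = next(gen) & mask
    period = 1
    for x in gen:
        if (x & mask) == first:
            return period
        period += 1

# Illustrative parameters: a = 5 (≡ 1 mod 4), b = 3 (odd), modulus 2^16,
# so the full generator has period 2^16.  The low n bits cycle with
# period exactly 2^n here.
for n in (1, 4, 8):
    print(n, low_bits_period(5, 3, 16, n))
```

Running this prints period $2^n$ for each $n$ tried, consistent with the claim; with other multipliers the low-bit period can be a proper divisor of $2^n$, which is why Knuth's statement is "$2^n$ or less".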