The rand function/algorithm - when does it begin to develop a pattern?


This question is rather general, but I am sure a specific, or at least a theoretical, answer can be provided. The rand function is a pseudo-random number generator: its output only seems random, because given a sufficient number of iterations a pattern develops, or at least so I am told. My question is: how large would the number of iterations have to be for this pattern to appear, and in which cases in science would it become a problem? The context in which I normally use this function is Excel. Thank you for any explanation; in-depth explanations would also be appreciated.

BEST ANSWER

In Excel 2003, Microsoft attempted to implement the Wichmann-Hill generator, but their implementation was incorrect. They tried to fix it in Excel 2007 and again did not get it right.
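For reference, here is a minimal sketch in Python of the published Wichmann-Hill algorithm (AS 183). This is the algorithm Excel was supposed to implement, not Microsoft's actual code:

    def wichmann_hill(s1, s2, s3):
        # Published Wichmann-Hill (AS 183): three small linear
        # congruential generators, each with a prime modulus,
        # combined by summing their outputs modulo 1.
        # Seeds s1, s2, s3 should be integers in 1..30000.
        while True:
            s1 = (171 * s1) % 30269
            s2 = (172 * s2) % 30307
            s3 = (170 * s3) % 30323
            yield (s1 / 30269 + s2 / 30307 + s3 / 30323) % 1.0

    gen = wichmann_hill(1, 2, 3)
    print([next(gen) for _ in range(3)])  # three values in [0, 1)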

The period of the generator is what you are asking about. A correctly implemented Wichmann-Hill generator has a known period of about 6.95 × 10^12. Because Excel's implementation deviates from the published algorithm, the period of the generator Excel actually ships is unknown.
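That figure follows from the component periods: each of the three generators has a prime modulus and a full period of modulus minus one, and the combined period is their least common multiple. A quick check in Python (my own illustration, not from the paper):

    from math import gcd

    def lcm(a, b):
        return a * b // gcd(a, b)

    # Component periods are modulus - 1 for each of the three LCGs.
    print(lcm(lcm(30268, 30306), 30322))  # 6953607871644, about 6.95e12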

I hope the newer versions are better, but I wouldn't bet on it.

You can read more detail in the paper "On the accuracy of statistical procedures in Microsoft Excel 2007," in Computational Statistics & Data Analysis, by McCullough (2008).