"The digits used in artificial numbers are random while the real numbers aren't and their digits distribution is specific to their business"


Related to the question https://math.stackexchange.com/questions/1924178/tools-to-measure-the-nonrandomness-of-database, I'm looking for tools to measure the nonrandomness of databases. Igael gave me a hint that I don't yet understand: "the digits used in artificial numbers are random while the real numbers aren't and their digits distribution is specific to their business".

Question: Could anyone explain in detail what this means exactly?



Best Answer

You might be interested in Benford's law:

Benford's law, also called the first-digit law, is an observation about the frequency distribution of leading digits in many real-life sets of numerical data. The law states that in many naturally occurring collections of numbers, the leading significant digit is likely to be small. For example, in sets which obey the law, the number 1 appears as the most significant digit about 30% of the time, while 9 appears as the most significant digit less than 5% of the time. By contrast, if the digits were distributed uniformly, they would each occur about 11.1% of the time. Benford's law also makes (different) predictions about the distribution of second digits, third digits, digit combinations, and so on.
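Concretely, Benford's law says the leading significant digit equals $d$ with probability $\log_{10}(1 + 1/d)$, so $d = 1$ occurs with probability $\log_{10} 2 \approx 0.301$. To make the hint tangible, here is a minimal Python sketch (my own illustration, not taken from the linked thread): it compares the observed leading-digit counts of a dataset against Benford's predicted counts using a chi-square statistic, contrasting a "natural" dataset (powers of 2, a classic Benford example) with "artificial" uniformly random numbers.

```python
import math
import random
from collections import Counter

def benford_expected():
    """Leading-digit probabilities under Benford's law:
    P(d) = log10(1 + 1/d) for d = 1, ..., 9."""
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit(x):
    """First significant digit of a nonzero number."""
    x = abs(x)
    while x < 1:
        x *= 10
    while x >= 10:
        x /= 10
    return int(x)

def chi_square_vs_benford(values):
    """Chi-square distance between the observed leading-digit counts
    and the counts Benford's law predicts; larger = less Benford-like."""
    digits = [leading_digit(v) for v in values if v != 0]
    counts = Counter(digits)
    n = len(digits)
    expected = benford_expected()
    return sum((counts.get(d, 0) - n * expected[d]) ** 2 / (n * expected[d])
               for d in range(1, 10))

if __name__ == "__main__":
    # Powers of 2 are known to follow Benford's law.
    natural = [2 ** k for k in range(1, 500)]
    # Uniformly random "artificial" numbers do not: every leading digit
    # turns up with roughly equal frequency.
    artificial = [random.uniform(1, 10_000) for _ in range(500)]
    print("powers of 2:   ", round(chi_square_vs_benford(natural), 2))
    print("uniform random:", round(chi_square_vs_benford(artificial), 2))
```

On a typical run the powers of 2 give a small chi-square while the uniform sample gives a much larger one. That gap is one concrete way to "measure nonrandomness" in the sense of the hint: genuine business figures tend to match Benford's curve, while fabricated or uniformly generated numbers usually don't.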