I've been slowly working my way into the world of AI and its representations of randomness. As you can guess, this usually starts with a call to `random()`, which returns a value in $[0, 1)$ that is then used to drive some desired random effect/action.
What confuses me is that once we get a random number back, let's say $0.10831288644112647$, people tend to use it for a single random action (flip a coin). Why can't we keep using this number to dictate more actions? Maybe even flip a lot of coins?
Let's use not just the value we get back but its decimal digits too. I'd expect each digit to have a $1$ in $10$ chance of being any of $[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]$, so couldn't we flip a coin for each digit?
Just speculating, but wouldn't we now be able to flip $18$ coins while still getting random results?
Continuing with this idea, could we combine the result of one decimal place with another? For example:

    if randomResult[0] && randomResult[1] == 3 then flip coin 19
Edit: the first value would not be in $[0,\dots,9]$ (since `random()` always returns a value less than $1$, the digit before the decimal point is always $0$)!
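To make the idea concrete, here is a sketch of what I mean (Python; `digit_flips` is just an illustrative name, and the even-digit = heads, odd-digit = tails mapping is an arbitrary choice):

```python
import random

def digit_flips(r):
    """Read each fractional decimal digit of r as one coin flip
    (even digit -> heads, odd digit -> tails; an arbitrary mapping)."""
    digits = str(r).split(".")[1]  # naive: assumes no scientific notation
    return ["heads" if int(d) % 2 == 0 else "tails" for d in digits]

flips = digit_flips(0.25)  # -> ["heads", "tails"]
```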
Mathematically, yes. We could generate $10$-digit random numbers and use each digit to determine one flip.
Unfortunately, this concept doesn't quite work in practice. Consider generating $17$-digit random numbers (without trimming zeros; that is, $1.23030000$, not $1.2303$). Then we get the following results over $100000$ runs:
This is probably because not all numbers print with $17$ digits, so the zero padding that gets added biases the counts significantly towards zero. Nor can we simply trim the padding, because then the counts are biased away from zeros (a trimmed representation never ends in a zero).
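The zero bias is easy to reproduce. A minimal sketch (Python; relies on CPython's shortest round-trip `repr` for floats, and skips the rare values that print in scientific notation):

```python
import random
from collections import Counter

counts = Counter()
for _ in range(100_000):
    s = repr(random.random())
    if "e" in s:  # skip rare tiny values printed in scientific notation
        continue
    # Zero-pad to 17 fractional digits ("1.23030000", not "1.2303").
    counts.update(s.split(".")[1].ljust(17, "0"))

zero_share = counts["0"] / sum(counts.values())
# zero_share lands noticeably above the 0.1 a uniform digit would give,
# because the padded zeros inflate the count for "0".
```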
However, if we instead generated integers in $[0, 2^{n})$, we could use them to do $n$ coin flips (heads if the $k$th bit is $1$, tails otherwise). It is the intricacies of floating-point representation that make this method of random decimals unviable.
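For completeness, a sketch of the integer approach (Python; `bit_flips` is an illustrative name, and `random.getrandbits(n)` draws uniformly from $[0, 2^{n})$):

```python
import random

def bit_flips(n):
    """Draw one uniform integer from [0, 2**n) and read each of its
    n bits as an independent, unbiased coin flip."""
    r = random.getrandbits(n)  # uniform over [0, 2**n)
    return ["heads" if (r >> k) & 1 else "tails" for k in range(n)]

flips = bit_flips(18)  # 18 fair flips from a single draw
```

Because every bit of a uniform integer in $[0, 2^{n})$ is independently $0$ or $1$ with probability $1/2$, each flip is unbiased, with none of the padding issues that decimal digits introduce.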