Simulate sum of N dice throws


For starters, let me apologise if this isn't the proper forum or if it has been asked before; I'm not very good at probability, and what's more, I'm not familiar with the proper English terms, so I may have missed something.

I'm writing a program that sums $n$ random values in some range, let's say $[0,1)$. Currently I simply simulate each "throw" and sum the results. This is fine if $n$ is small, but unfortunately $n$ can be in the high millions. So my question is: how can I simulate it more efficiently?

I assume that these sums have a normal distribution, and that the probabilities of all results sum to $1$ (which seems obvious, but hey). So there simply MUST be some "inverted" function that returns a sum given some random, uniformly distributed value. I think its domain will be $[0,1]$ (well, a probability) and its values will lie in $[0,n]$ (all possible sums). It will also rise quickly at first, then flatten around $n/2$ (?), and then rise sharply again towards $n$.

Sorry if this makes no sense to you.

EDIT: to barack manos in a comment below: "normal distribution" (the bell curve?) concerns the distribution of possible results; "uniform distribution" concerns the variable passed to the hypothetical function I wrote about.

BEST ANSWER

My understanding is that you are considering $n$ independent random variables $X_1, X_2, \ldots, X_n$, each with a continuous uniform distribution on $[0,1)$, and their sum $Y = X_1+X_2+\cdots+X_n$. If $n$ is large, then by the central limit theorem $Y$ will be approximately Gaussian with mean $n/2$ and variance $n/12$ (since each variable in the sum has mean $1/2$ and variance $1/12$).

So your question then becomes how to simulate a Gaussian variable with mean $n/2$ and variance $n/12$. If you're using a math library with a random number generator, this may already be available as a library function. In case you can only generate a standard Gaussian $Z$ with mean $0$ and variance $1$, you can simply take $Y = \frac{n}{2} + \sqrt{\frac{n}{12}} \cdot Z$.
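As a minimal sketch of this idea in Python (the question doesn't name a language, so this is just an illustration), `random.gauss(mu, sigma)` from the standard library draws the Gaussian sample directly, which replaces millions of individual throws with a single draw:

```python
import math
import random

def simulate_sum(n: int, rng: random.Random = random) -> float:
    """Approximate the sum of n Uniform[0,1) throws by one Gaussian sample."""
    mean = n / 2
    sigma = math.sqrt(n / 12)
    # Equivalent to mean + sigma * Z for a standard Gaussian Z
    return rng.gauss(mean, sigma)

print(simulate_sum(10_000_000))
```

This runs in constant time regardless of $n$, whereas the naive loop is linear in $n$.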

If all you have access to is a random number generator that gives you samples from a continuous uniform distribution on $[0,1)$, the "inverted" function you're asking for is the inverse of the cumulative distribution function of the Gaussian distribution; it can be computed with the inverse error function $\text{erfinv}$ or its close relative $\text{erfcinv}$.
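In Python, this inverse CDF is available in the standard library as `statistics.NormalDist.inv_cdf` (Python 3.8+), so the "inverted" function from the question can be sketched directly; it is related to $\text{erfinv}$ by $\Phi^{-1}(u) = \mu + \sigma\sqrt{2}\,\text{erfinv}(2u-1)$:

```python
import math
from statistics import NormalDist

def sum_from_uniform(u: float, n: int) -> float:
    """Map one uniform sample u in (0,1) to an approximate sum of n throws
    via the inverse Gaussian CDF -- the "inverted" function in the question."""
    dist = NormalDist(mu=n / 2, sigma=math.sqrt(n / 12))
    return dist.inv_cdf(u)

# u = 0.5 maps to the mean n/2; u near 0 or 1 maps into the tails
print(sum_from_uniform(0.5, 1_000_000))  # -> 500000.0
```

This matches the shape described in the question: it rises quickly near $u=0$, flattens around $n/2$, and rises sharply again as $u \to 1$.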

In case it is important that the result lies in $[0,n]$, you can use rejection (redraw the sample) for results outside that interval. The Gaussian approximation produces results outside the interval with very low probability (for large $n$).
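The rejection step amounts to a simple resampling loop, sketched here under the same Gaussian approximation; for large $n$ the boundary is about $\sqrt{3n}$ standard deviations from the mean, so the loop almost never repeats:

```python
import math
import random

def bounded_sum(n: int, rng: random.Random = random) -> float:
    """Resample the Gaussian approximation until it lands in [0, n]."""
    mean, sigma = n / 2, math.sqrt(n / 12)
    while True:
        y = rng.gauss(mean, sigma)
        if 0.0 <= y <= n:
            return y
```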

This is the inverse transform sampling method for generating Gaussian samples; see e.g. https://en.wikipedia.org/wiki/Inverse_transform_sampling for the method in general and a brief explanation of why it works.

An easier method to generate standard Gaussian samples is the Box–Muller transform, which produces two standard Gaussian samples from two uniform samples; see e.g. https://en.wikipedia.org/wiki/Box%E2%80%93Muller_transform
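The basic form of the Box–Muller transform can be sketched as follows (using Python's `math` and `random` modules for illustration); the resulting standard Gaussian is then rescaled to mean $n/2$ and variance $n/12$ as above:

```python
import math
import random

def box_muller(rng: random.Random = random):
    """Turn two Uniform[0,1) samples into two independent standard Gaussians."""
    u1 = rng.random()
    u2 = rng.random()
    while u1 == 0.0:  # guard against log(0); random() can return 0.0
        u1 = rng.random()
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

z1, z2 = box_muller()
n = 1_000_000
y = n / 2 + math.sqrt(n / 12) * z1  # rescale to the sum's distribution
```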