Currently I'm working on a project where I need to simulate, second by second, the decay of a large number of particles of a radioactive isotope.
One way to do so is, each second, to draw a uniform random number for each particle and count a decay whenever the draw is smaller than the decay constant.
Although this model is accurate, it is very inefficient for large numbers of particles.
My second approach was to model the number of decays per second with a binomial distribution, approximated by a normal (Gaussian) distribution. This works fine when the expected number of decays is sufficiently large, but fails when the probability of a decay is small.
For instance, let's take tritium, with a half-life of $3.88524\times10^{8}$ seconds. If we have $2,000,000$ particles of tritium, we expect about 1 decay every 5 minutes. Running the first simulation confirms this; with the second, however, a decay never happens (for as long as I ran the experiment).
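That expected rate follows directly from the per-particle decay probability $\lambda = \ln 2 / t_{1/2}$; a quick sanity check:

```javascript
// Expected decay rate for 2,000,000 tritium particles.
var halfLife = 3.88524e8;                // tritium half-life in seconds
var p = Math.log(2) / halfLife;          // per-particle decay probability per second
var perSecond = 2e6 * p;                 // expected decays per second across all particles
var perFiveMinutes = perSecond * 300;    // about 1.07, i.e. roughly one decay per 5 minutes
```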
Any advice on how to better model this phenomenon?
EDIT: Here is the first version, which simulates correctly:
var production = 0;
for (var i = 0; i < number; i++) {
    // one uniform roll per particle; a roll below the decay constant counts as a decay
    var rand = Math.random();
    if (rand < decay_constant) {
        production++;
    }
}
Here is the second version, which simulates incorrectly:
// p is the per-second decay probability (decay constant)
var p = Math.log(2) / half_life;
var q = 1 - p;
var mean = number * p;
var variance = number * p * q;
var std = Math.sqrt(variance);
// normal approximation to Binomial(number, p);
// nextGaussian() draws from a standard normal distribution
var prod = Math.round(numberGenerator.nextGaussian() * std + mean);
Here are some exact probability computations comparing Poisson, binomial, and normal distributions. I believe the Poisson distribution is best as long as the period of time considered is several orders of magnitude less than the half life. (For longer periods of time adjustments of the kind suggested in the Comment by @A.S. are helpful.)
Poisson model for one minute with binomial approximation. The number of decays $X$ observed in one minute should average $\lambda = 1/5.$ So we use $X \sim Pois(\lambda = 1/5).$ Because the number of particles is $n = 2 \times 10^6$, a binomial distribution with $p = \lambda/n = 10^{-7}$ has the same mean. For these parameters the fit is quite good. The following R code makes a table of the respective probabilities for $x = 0, 1, 2, 3$ and the left panel of the figure at the end (Poisson bars, binomial open circles).
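For comparison, the same table can be computed directly in JavaScript (a sketch; `dpois` and `dbinom` below are written out from the standard pmf formulas and mirror the R functions of the same names):

```javascript
// Compare Pois(lambda = 0.2) and Binom(n = 2e6, p = 1e-7) for x = 0..3.
function dpois(x, lambda) {
    var fact = 1;
    for (var i = 2; i <= x; i++) fact *= i;                 // x!
    return Math.exp(-lambda) * Math.pow(lambda, x) / fact;
}
function dbinom(x, n, p) {
    var coef = 1;
    for (var i = 0; i < x; i++) coef *= (n - i) / (i + 1);  // C(n, x)
    return coef * Math.pow(p, x) * Math.pow(1 - p, n - x);
}
for (var x = 0; x <= 3; x++) {
    console.log(x, dpois(x, 0.2).toFixed(6), dbinom(x, 2e6, 1e-7).toFixed(6));
}
```

With these parameters the two columns agree to several decimal places, which is the "quite good fit" referred to above.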
Roughly speaking, a sufficient condition for a reasonable fit between binomial and Poisson is that $n$ be very large, $p$ very small, and $\lambda = np$ of moderate size. However, this Poisson distribution is so severely skewed that it does not make sense to try a normal approximation.
Poisson model for one hour with binomial and normal approximations. The number of decays $X$ in one hour has a mean 60 times as large, so we use $X \sim Pois(\lambda = 12).$ The corresponding binomial has the same $n$ as above and $p = \lambda/n = 6 \times 10^{-6},\,$ so that the means match.
As $\lambda$ increases, Poisson distributions become less skewed and a normal approximation becomes feasible. You can see, from the plot in the right-hand panel below, that $\lambda = 12$ is not yet quite large enough for a really good approximation. The appropriate normal approximation is $Norm(\mu = \lambda,\, \sigma =\sqrt{\lambda}).$ The additional R code below finishes the plot.
As an example: for the probability of 10 or fewer events in an hour, the Poisson and binomial distributions both give 0.3472, and the normal approximation (with continuity correction) gives 0.3325 (all rounded to four places).
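Those numbers can be reproduced with elementary code; the sketch below evaluates the three CDFs directly (the normal CDF uses the Abramowitz–Stegun erf approximation, accurate to about $1.5\times10^{-7}$, with a continuity correction at $10.5$):

```javascript
// P(X <= 10) under Pois(12), Binom(2e6, 6e-6), and Norm(12, sqrt(12)).
function poisCdf(k, lambda) {
    var term = Math.exp(-lambda), sum = term;
    for (var i = 1; i <= k; i++) { term *= lambda / i; sum += term; }
    return sum;
}
function binomCdf(k, n, p) {
    var term = Math.pow(1 - p, n), sum = term;      // term starts as P(X = 0)
    for (var i = 1; i <= k; i++) {
        term *= ((n - i + 1) / i) * (p / (1 - p));  // recurrence for successive pmf terms
        sum += term;
    }
    return sum;
}
// Standard normal CDF via the Abramowitz-Stegun erf approximation.
function phi(z) {
    var x = Math.abs(z) / Math.SQRT2;
    var t = 1 / (1 + 0.3275911 * x);
    var poly = t * (0.254829592 + t * (-0.284496736 + t * (1.421413741 +
               t * (-1.453152027 + t * 1.061405429))));
    var erf = 1 - poly * Math.exp(-x * x);
    return z >= 0 ? 0.5 * (1 + erf) : 0.5 * (1 - erf);
}
console.log(poisCdf(10, 12).toFixed(4));                   // Poisson
console.log(binomCdf(10, 2e6, 6e-6).toFixed(4));           // binomial
console.log(phi((10.5 - 12) / Math.sqrt(12)).toFixed(4));  // normal, continuity-corrected
```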
Note: I hope the R code is sufficiently transparent that you can follow it as much as you need to, and perhaps make minor modifications. R software is available free of charge and with easy installation at
www.R-project.org. There are Windows, Mac, and UNIX versions. I have used only the 'base' of R (no libraries) and put #-symbols to keep answers and comments from interfering with running the code as shown.
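Back in the question's JavaScript setting, the per-second decay count can be drawn from a Poisson distribution directly. A sketch using Knuth's multiplication algorithm, which is fine for the small $\lambda$ in this problem (for large $\lambda$, a normal approximation or a dedicated sampler is preferable):

```javascript
// Draw the number of decays in one time step from Pois(lambda),
// where lambda = number * decay_constant (Knuth's algorithm).
function samplePoisson(lambda) {
    var L = Math.exp(-lambda);
    var k = 0, p = 1;
    do {
        k++;
        p *= Math.random();
    } while (p > L);
    return k - 1;
}

// Example: 2,000,000 tritium particles, decay constant ln(2)/half_life per second.
var lambda = 2e6 * Math.log(2) / 3.88524e8;   // about 3.57e-3 expected decays per second
var production = samplePoisson(lambda);
```

This replaces the 2,000,000-iteration loop with a single draw per time step, and unlike the Gaussian version it produces the occasional decay at the correct rate even when the mean is far below 1.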