A random sequence converging to $e$


Choose a random number between 0 and 1 and add it to a running sum that starts at 0. Choose another random number between 0 and 1 and add it to the sum. If the sum exceeds 1, stop; otherwise, repeat. Record the number of draws you had to make. It seems that the average (total draws divided by total trials) converges to $e$. Here is equivalent Python code:

import random

total_choices = 0   # total number of draws across all trials
iterations = 1000   # number of trials to run

for _ in range(iterations):
    total = 0.0
    choices = 0
    # keep drawing until the running sum exceeds 1
    while total < 1:
        total += random.uniform(0, 1)
        choices += 1
    total_choices += choices

print(total_choices)
print(total_choices / iterations)
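The same simulation can be packaged as a function, which makes it easy to rerun with different trial counts (a minimal sketch; the function name is just for illustration):

```python
import random

def average_choices(iterations):
    """Estimate the expected number of uniform(0, 1) draws
    needed for the running sum to exceed 1."""
    total_choices = 0
    for _ in range(iterations):
        total, choices = 0.0, 0
        while total < 1:
            total += random.uniform(0, 1)
            choices += 1
        total_choices += choices
    return total_choices / iterations

print(average_choices(10000))
```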

Here are some example outputs I got, from which I hypothesized the above statement:

Iterations=100 average=2.62

Iterations=1000 average=2.752

Iterations=10000 average=2.7131

Is my hypothesis correct? If yes then why does this happen?
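(A sketch of where $e$ might come from, in case it helps: if $N$ is the number of draws and $U_1, U_2, \dots$ are i.i.d. uniform on $[0,1]$, then $N > n$ exactly when the first $n$ draws still sum to less than 1, and the volume of that simplex is

$$P(N > n) = P(U_1 + \cdots + U_n < 1) = \frac{1}{n!},$$

so

$$E[N] = \sum_{n=0}^{\infty} P(N > n) = \sum_{n=0}^{\infty} \frac{1}{n!} = e.$$

I am not sure this argument is complete, which is part of what I am asking.)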

P.S. Please feel free to suggest a more appropriate title. I thought this question was more appropriate for this community than Stack Overflow.