Why does it take an average of $\mathrm{e}$ random numbers in $(0,1)$ to exceed $1$?


I recently came across this interesting fact:

  • Take some (pseudo)random numbers drawn uniformly between $0$ and $1$.
  • Now sum these numbers and count how many are needed in order for the sum to be greater than $1$.
  • If you repeat the experiment $n$ times, you will see that, as $n$ grows, the average count approaches $\mathrm{e}$ (Euler's number)!

My mind is blown! But why? How does this work?
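For reference, here is a quick simulation sketch (in Python, using the standard `random` module; the function name `count_to_exceed_one` is just an illustrative choice) that reproduces the experiment:

```python
import random

def count_to_exceed_one():
    """Draw uniforms on (0,1) until their running sum exceeds 1; return the count."""
    total, count = 0.0, 0
    while total <= 1.0:
        total += random.random()
        count += 1
    return count

n = 1_000_000
avg = sum(count_to_exceed_one() for _ in range(n)) / n
print(avg)  # should be close to e ≈ 2.71828
```

With a million trials, the printed average typically agrees with $\mathrm{e}$ to two or three decimal places.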