Why is the number $e$ involved in Poisson's Distribution Formula?


I know the theory about the number $e$ in compound interest, etc. But I've been really thinking about this and I don't see what the number $e$ is doing here. Many thanks!


BEST ANSWER

The Poisson distribution is the limit of the binomial distribution as the number of trials $n$ goes to infinity while the mean $t = pn$ is held constant.

Substitute $p = t/n$ into the binomial probability mass function $$Bin(k;n,p) = \frac{n!}{k!(n-k)!} \cdot p^k \cdot (1-p)^{(n-k)}$$

Since $p = t/n$, the success term is $$p^k = \frac{t^k}{n^k}$$

so the first two factors can be regrouped:

$$\frac{n!}{k!(n-k)!}\cdot \frac{t^k}{n^k} \rightarrow \frac{t^k}{k!} \cdot\frac{n!}{n^k(n-k)!}$$

In the second factor, the numerator $\frac{n!}{(n-k)!} = n(n-1)\cdots(n-k+1)$ is a product of $k$ factors, each close to $n$; dividing by the $k$ copies of $n$ in $n^k$ gives $k$ ratios that each tend to $1$ as $n \rightarrow \infty$, so

$$\frac{n!}{n^k(n-k)!} \rightarrow 1$$ and

$$\frac{n!}{k!(n-k)!}\cdot \frac{t^k}{n^k} \rightarrow \frac{t^k}{k!} \cdot1 $$

For the remaining factor,

$$(1-p)^{(n-k)} = \frac{\left(1-\frac{t}{n}\right)^n}{\left(1-\frac{t}{n}\right)^k}$$

$$\left(1-\frac{t}{n}\right)^n \rightarrow e^{-t}$$

$$\left(1-\frac{t}{n}\right)^k \rightarrow 1^k = 1 \quad \text{(since $k$ is fixed)}$$

Putting these together gives the Poisson formula

$$ \frac{t^k}{k!} e^{-t} $$

ANSWER

If you recall, $e$ appears in the formula for continuous compounding, and more generally, it tends to rear its head in continuous-time processes.

It is thus not too surprising that $e$ appears in the Poisson distribution mass function since the law of rare events tells us the Poisson distribution is a limiting binomial distribution as the number of trials goes to infinity (holding fixed the mean).
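The connection to continuous compounding can be made concrete with the same limit that produces $e$ there: $(1 - \mu/n)^n \to e^{-\mu}$ (a minimal sketch; the variable names are mine).

```python
from math import exp

mu = 2.0
# (1 - mu/n)^n approaches e^{-mu} as n grows -- the same limit as in
# continuous compounding, here with a negative "rate" -mu
for n in [10, 100, 100000]:
    print(n, (1 - mu / n) ** n, exp(-mu))
```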

ANSWER

The Poisson distribution is the $n\to\infty$ limit of the binomial distribution at fixed $\mu=np$. In terms of the probability mass function, your question is why
$$\lim_{n\to\infty}\binom{n}{k}\left(\frac{\mu}{n}\right)^k\left(1-\frac{\mu}{n}\right)^{n-k}=e^{-\mu}\frac{\mu^k}{k!}.$$

This can be proved as @GoldenRatio discussed; but, in the interest of advertising some machinery that is very useful in studying both distributional families, I'll instead work in terms of the probability-generating function, which for the binomial distribution is
$$\left(1-\frac{\mu}{n}\right)^n\sum_{k=0}^n\binom{n}{k}\left(\frac{\mu z}{n-\mu}\right)^k=\left(1+\frac{\mu(z-1)}{n}\right)^n.$$

The $n\to\infty$ limit is of course $\exp[\mu(z-1)]=\sum_{k\ge0}e^{-\mu}\frac{(\mu z)^k}{k!}$, making the probability mass $\frac{e^{-\mu}\mu^k}{k!}$ for $k\in\{0,1,2,\dots\}$, as expected.
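The convergence of the generating functions can also be checked numerically (a sketch; `binom_pgf` is my name for $E[z^X]$ computed directly from the binomial mass function): at a fixed point $z$, the binomial PGF with $p=\mu/n$ approaches $e^{\mu(z-1)}$.

```python
from math import comb, exp

def binom_pgf(z, n, p):
    # E[z^X] for X ~ Binomial(n, p), summed directly from the pmf
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * z**k
               for k in range(n + 1))

mu, z = 3.0, 0.7
for n in [10, 100, 1000]:
    # binomial PGF at p = mu/n vs. the Poisson PGF exp(mu*(z-1))
    print(n, binom_pgf(z, n, mu / n), exp(mu * (z - 1)))
```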