Why is $e$ an infinite summation of the likelihood of picking any one of $n$ arrangements?


Can someone provide an alternative explanation to why $e$ can be thought of as an infinite summation of the likelihood of picking any one arrangement out of $n$ arrangements?

$$\frac{1}{0!} + \frac{1}{1!} + \frac{1}{2!} + \cdots + \frac{1}{n!}$$

Though I am not averse to axiomatic arguments, I am seeking a "wordier" explanation or analogy; why should $e$ somehow be related to the idea of "the (summed) probability of choosing a particular arrangement of $n$ arrangements, as $n$ approaches infinity."


Best answer:

To answer the question I suggested in the comments(!), here are a few probabilistic interpretations for $e$ and $e^{-1}$.

The two canonical probabilistic interpretations for $e^{-1}$ that I tend to see are the question of 'meeting expectations' and the derangements problem. The meeting expectations question is simple:

If I have a one-in-$n$ probability of doing something and I try it (independently) $n$ times, what's the chance that none of the tries succeeds?

Since the probability of success is $1/n$, the probability of failure is $(1-1/n)$, and since the tries are independent the probability they all fail is $(1-1/n)^n$; as $n$ gets larger and larger, then, this approaches $e^{-1}$.
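As a quick numeric sketch (not part of the argument), we can watch $(1-1/n)^n$ approach $e^{-1}\approx 0.3679$:

```python
import math

# (1 - 1/n)^n should approach e^{-1} ~ 0.3679 as n grows.
for n in (10, 100, 10_000, 1_000_000):
    print(n, (1 - 1/n) ** n)

print("1/e =", 1 / math.e)
```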

The derangements problem is similar but subtly different:

If I take $n$ different hats off of $n$ people, shuffle them randomly, and redistribute them, what's the probability that nobody gets their own hat back?

Here it can be shown by various means (for instance, inclusion-exclusion) that this probability is $1-\frac1{1!}+\frac1{2!}-\frac1{3!}+\frac1{4!}-\cdots+\frac{(-1)^{n}}{n!}$, and so once again the limit as we approach infinitely many hats is $e^{-1}$.
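We can also check the derangement probability empirically with a small Monte-Carlo simulation (an illustrative sketch; `derangement_fraction` is a name of my own invention, not a standard routine):

```python
import random

def derangement_fraction(n, trials=200_000):
    """Estimate the probability that a random shuffle of n hats
    returns no hat to its owner (i.e. is a derangement)."""
    count = 0
    hats = list(range(n))
    for _ in range(trials):
        random.shuffle(hats)
        if all(h != i for i, h in enumerate(hats)):
            count += 1
    return count / trials

print(derangement_fraction(10))  # close to 1/e ~ 0.3679
```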

While $e$ itself can't be interpreted as a probability per se (since it's larger than 1), it can be interpreted as a stopping time or expected value for a process. For instance, suppose that we follow this process:

Start with a deck of one card (an ace). Shuffle it randomly, and see whether the top card after it's shuffled is the ace. If so, add a different card (a deuce) to the deck. Repeat until we get a shuffle that doesn't have the ace at the top.

Then we can ask how many cards the deck contains, on average, when we stop. The probability that we stop after $n$ steps is the probability that we succeeded on all the previous steps, $1\cdot\frac12\cdot\frac13\cdots\frac1{n-1} = \frac1{(n-1)!}$, times the probability that we fail this time, $\frac{n-1}n$; in other words, $\frac1{n\cdot(n-2)!}$. (This requires $n\geq 2$, but the probability that we stop after the first shuffle is zero, since a one-card deck always has the ace on top.) Since the deck then contains $n$ cards, we can compute the expected number of cards by multiplying that count by the stopping probability and summing over all possible stopping points: $\sum_{n\geq 2}n\cdot\frac1{n\cdot(n-2)!} =\sum_{n\geq 2}\frac1{(n-2)!} =\sum_{m\geq 0}\frac1{m!} =e$.
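A simulation of this card process (a sketch of my own, using the fact that with $k$ cards the ace lands on top with probability $1/k$) shows the average stopping deck size sitting near $e$:

```python
import random

def final_deck_size():
    """Simulate the process: with k cards, the ace is on top with
    probability 1/k; if it is, add a card and reshuffle, otherwise
    stop and report the current deck size k."""
    k = 1
    while random.randrange(k) == 0:  # ace on top with probability 1/k
        k += 1
    return k

trials = 200_000
avg = sum(final_deck_size() for _ in range(trials)) / trials
print(avg)  # close to e ~ 2.71828
```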

Another answer:

Not a proof but a handwaving argument:

If $f(x) = b^x$ (assume $b>0$, $b\ne 1$), then $f'(x)=\lim_{h\to 0}\frac {b^{x+h}-b^x}h =b^x\lim_{h\to 0}\frac {b^h-1}{h}$.

$e$ can either be defined or derived to be the number such that if $f(x) =e^x$ then $f'(x) =e^x =f(x)$. (This also means $\lim_{h\to 0} \frac {b^h-1}h = \ln b$.)

(In actual fact this is a bit backwards, as most calculus courses define $\ln x := \int_1^x \frac 1t\,dt$, define $e^x$ as the inverse of that, and set $b^x := e^{x\ln b}$; they then prove that for rational exponents this agrees with how we expect exponents to work, and that $e = e^1 = \lim_{n\to\infty}\left(1+\frac 1n\right)^n$. Intuitively completely backwards, but rigorously sound.)

Now we have Newton's approximation: we can estimate $f(x)$, where $x$ is just slightly larger than $a$, as $f(x) \approx f(a)+ f'(a)(x-a)$. This makes sense if $f'(a)$ is viewed as the slope of the tangent line to the graph at $a$; the approximation just follows where that line would continue past the point of tangency.
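To see the tangent-line idea numerically (a small sketch of my own): for $f(x)=e^x$ at $a=0$ we have $f(0)=f'(0)=1$, so the approximation is $e^x \approx 1+x$, and it is good only for $x$ near $0$:

```python
import math

# First-order (tangent-line) approximation of e^x at a = 0:
# e^x ~ 1 + x, accurate only for x close to 0.
for x in (0.01, 0.1, 0.5):
    print(x, 1 + x, math.exp(x))
```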

If $f(x)$ is infinitely differentiable (which $e^x =(e^x)' = (e^x)''= \cdots$ certainly is), then we can apply Newton's approximation recursively and inductively to build a Taylor series.

For any $a$, $$f(x) = f(a) + \frac {f'(a)}{1!}(x-a) + \frac {f''(a)}{2!}(x-a)^2 + \cdots.$$

And if we let $f(x) = e^x$ and $a = 0$, then $f^{(k)}(x) = e^x$, so $e^x = e^0 + \frac {e^0}{1!}(x-0) + \frac {e^0}{2!}(x-0)^2+\cdots= 1 + \frac 1{1!}x + \frac 1{2!}x^2 + \frac 1{3!}x^3 +\cdots$

And if we take $x = 1$, we have $e =e^1 = \frac 1{0!} + \frac 1{1!} + \frac 1{2!} + \cdots$
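The partial sums of this series converge to $e$ remarkably quickly; a few terms already give many correct digits (a small sketch computing each term as $1/n!$ incrementally):

```python
import math

# Partial sums of 1/0! + 1/1! + 1/2! + ... converge rapidly to e.
total, term = 0.0, 1.0
for n in range(15):
    total += term      # term equals 1/n! at this point
    term /= (n + 1)    # next term: 1/(n+1)!
print(total, math.e)
```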