Choose a random number between $0$ and $1$ and record its value. Do this again and add the second number to the first number. Keep doing this until the sum of the numbers exceeds $1$. What's the expected value of the number of random numbers needed to accomplish this?
Choose a random number between $0$ and $1$ and record its value. Keep doing this until the sum of the numbers exceeds $1$. On average, how many tries do we need?
Asked by Bumbble Comm. There are 4 solutions below.
Here is a (perhaps) more elementary method. Let $X$ be the number of draws needed until the sum exceeds $1$. Writing $X = \sum_{k \geq 0} \mathbf{1}[X > k]$ and using linearity of expectation (the $k = 0$ term is always $1$ since $X \geq 1$):
$$ \mathbb{E}[X] = 1 + \sum_{k \geq 1} \Pr[X > k] $$
Now $X > k$ exactly when the sum of the first $k$ numbers $x_1,\ldots,x_k$ is smaller than $1$. This probability equals the volume of the $k$-dimensional set:
$$ \left\{(x_1,\ldots,x_k) : \sum_{i=1}^k x_i \leq 1, \, x_1,\ldots,x_k \geq 0\right\}$$
This is known as the $k$-dimensional simplex. When $k = 1$, we get a line segment of length $1$. When $k = 2$, we get a right isosceles triangle with legs of length $1$, so the area is $1/2$. When $k = 3$, we get a tetrahedron with three mutually perpendicular edges of length $1$ meeting at the origin, so the volume is $1/6$. In general, the volume is $1/k!$, and so
$$ \mathbb{E}[X] = 1 + \sum_{k \geq 1} \frac{1}{k!} = e. $$
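The conclusion $\mathbb{E}[X] = e$ is easy to check empirically. Here is a short Monte Carlo sketch in Python (the trial count and seed are arbitrary choices):

```python
import random

def draws_to_exceed_one(rng: random.Random) -> int:
    """Count uniform(0,1) draws needed for the running sum to exceed 1."""
    total, count = 0.0, 0
    while total <= 1.0:
        total += rng.random()
        count += 1
    return count

rng = random.Random(0)
trials = 200_000
avg = sum(draws_to_exceed_one(rng) for _ in range(trials)) / trials
print(avg)  # close to e ≈ 2.71828
```

With a few hundred thousand trials the sample mean lands within about $\pm 0.01$ of $e$.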
Here is another method. This is exactly the same technique Yuval has given above but with an added step that goes through computing the pmf first before computing the expectation. Hopefully, this will help you understand the problem from a slightly different angle.
Let us denote by $N$ the number of random variables we need to add for the sum to exceed $1$. We first find the distribution (probability mass function) of $N$. The easiest way to do this is by computing $P(N > n)$ for $n=1,2,3,\dots$. Once we know this, we can compute $P(N = n) = P(N > n-1) - P(N > n)$.
In plain English, the event $\{N > n\}$ says that the sum of the first $n$ uniform random variables did not exceed $1$. Its probability is therefore $P(N > n) = P(U_1 + U_2 + \dots + U_n < 1)$, which a standard multi-dimensional integral shows to equal $\frac{1}{n!}$. (One can also justify this result geometrically, but integration is perhaps the most natural approach.) Therefore, we have $P(N = n) = \frac{1}{(n-1)!} - \frac{1}{n!} = \frac{n-1}{n!}$ for $n=1,2,3,\dots$
Once we know the pmf of $N$, we calculate
$$E(N) = \sum_{n=1}^{\infty} n\,P(N=n) = \sum_{n=1}^{\infty} \frac{n(n-1)}{n!} = \sum_{n=2}^{\infty} \frac{1}{(n-2)!} = \sum_{n=0}^{\infty} \frac{1}{n!} = e.$$
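The pmf and the expectation above can be checked numerically. A small sketch (the truncation point $n = 30$ is an arbitrary choice; the tail beyond it is negligible):

```python
from math import factorial, e

# pmf of N: P(N = n) = (n - 1) / n!  for n >= 1
def pmf(n: int) -> float:
    return (n - 1) / factorial(n)

# sanity check: the probabilities sum to 1 (up to a negligible truncated tail)
total_prob = sum(pmf(n) for n in range(1, 30))

# expectation E[N] = sum over n of n * P(N = n)
expectation = sum(n * pmf(n) for n in range(1, 30))

print(total_prob, expectation)  # ≈ 1.0 and ≈ e ≈ 2.71828
```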
The multidimensional integral for the probability that the sum stays below $1$ can be simplified by restoring permutation symmetry in the integration before evaluating it. The probability $P_n$ that the sum of $n$ uniformly distributed random numbers between $0$ and $1$ does not exceed $1$ is given by:
$$P_n = \int_{0\leq s_1\leq s_2\leq\cdots \leq s_n\leq 1} ds_1\, ds_2\cdots ds_n $$
The integration variables here are the partial sums $s_j = x_1 + \cdots + x_j$ of the first $n$ random variables, which are therefore ordered as indicated (this change of variables has unit Jacobian). But because these are dummy variables, we can permute them in any way we like. This means that integrating over the variables without imposing the ordering yields $n!$ times the value of the original integral. We thus have:
$$P_n = \frac{1}{n!} \int_0^1\int_0^1\cdots\int_0^1 ds_1 ds_2\cdots ds_n = \frac{1}{n!}$$
The expected value then follows, e.g. using the derivation given in the answer by Dinesh above.
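The identity $P_n = 1/n!$ can also be checked by simulation. A sketch (the seed and trial count are arbitrary choices):

```python
import random
from math import factorial

rng = random.Random(1)

def estimate_P(n: int, trials: int = 100_000) -> float:
    """Monte Carlo estimate of P(U_1 + ... + U_n < 1)."""
    hits = sum(sum(rng.random() for _ in range(n)) < 1 for _ in range(trials))
    return hits / trials

# compare the Monte Carlo estimate against 1/n! for small n
results = {n: (estimate_P(n), 1 / factorial(n)) for n in range(1, 5)}
for n, (est, exact) in results.items():
    print(n, est, exact)
```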
Assuming the numbers come from a uniform distribution over $[0,1]$ and that the trials are independent, here is an outline (this is Example 4h in Section 7.4 of Sheldon Ross' A First Course in Probability, sixth edition):
Let $X_i$ be the number obtained on the $i$'th trial.
For $0\le x\le1$, let $Y(x)$ be the minimum number of trials needed so that the sum of the $X_i$ exceeds $x$. Set $e(x)=\Bbb E [Y(x)]$.
Compute $e(x)$ by conditioning on the value of $X_1$: $$\tag{1} e(x)=\int_0^1 \Bbb E [ Y(x) | X_1=y]\, dy. $$
Here, use the fact that $$\tag{2}\Bbb E [ Y(x) | X_1=y] = \cases{1,& $y>x$\cr 1+e(x-y),& $y\le x $}.$$
Substitution of $(2)$ into $(1)$ will give $$\tag{3} e(x)=1+\int_0^x e(u)\,du. $$
Solve equation $(3)$ for $e(x)$: differentiating both sides with respect to $x$ gives $e'(x) = e(x)$, and $(3)$ supplies the initial condition $e(0) = 1$, so $e(x) = e^x$.
You wish to find $e(1) = e$.
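As a numerical check of this outline, the integral equation $(3)$ can be discretized with the trapezoidal rule and solved on a grid (a sketch; the step count `N` is an arbitrary choice). The value at $x = 1$ should come out very close to $e$:

```python
from math import exp

# Solve e(x) = 1 + ∫_0^x e(u) du on [0, 1] with the trapezoidal rule.
N = 10_000
h = 1.0 / N
vals = [1.0]      # e(0) = 1, since the equation gives e(0) = 1 + 0
integral = 0.0    # running value of ∫_0^{x_i} e(u) du
for i in range(1, N + 1):
    prev = vals[-1]
    # implicit trapezoid update  v = 1 + integral + (h/2) * (prev + v),
    # solved in closed form for v:
    v = (1.0 + integral + h / 2 * prev) / (1.0 - h / 2)
    integral += h / 2 * (prev + v)
    vals.append(v)

print(vals[-1], exp(1))  # e(1) ≈ e
```

The scheme is second-order accurate, so with $10{,}000$ steps the grid solution agrees with $e^x$ to roughly $10^{-8}$.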