Dice Roll Cumulative Sum


An $n$-sided die is rolled repeatedly until the cumulative sum of the top faces becomes $\ge n$. If $n$ is sufficiently large, what is the expected number of rolls required? The expected number of rolls appears to approach $e$, but I don't know how to prove it.

This problem is a variation of the following problem:

http://www.cseblog.com/2014/12/expected-number-of-attempts-broken.html

A solution is mentioned in the comment section there; the answer is indeed $e$.
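Before looking at the proofs, the claim is easy to check empirically. Here is a quick Monte Carlo sketch in Python (the function names are my own, not from any of the answers):

```python
import random

def rolls_until_sum_at_least_n(n, rng):
    """Roll a fair n-sided die until the running total reaches n; return the roll count."""
    total, rolls = 0, 0
    while total < n:
        total += rng.randint(1, n)
        rolls += 1
    return rolls

def estimate_expected_rolls(n, trials=100_000, seed=0):
    """Average the roll count over many independent trials."""
    rng = random.Random(seed)
    return sum(rolls_until_sum_at_least_n(n, rng) for _ in range(trials)) / trials

# For large n the estimate should sit near e ~ 2.71828.
print(estimate_expected_rolls(1000))
```

With $n = 1000$ the estimate lands close to $e$, as the answers below derive exactly.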


BEST ANSWER

For $n=1,2,\dots$ and $k\in\{1,\dots,n\}$ let $\mu_{n,k}$ denote the expected number of rolls needed to get a sum $\geq k$.

You are after expressions for $\mu_{n,n}$ and $\lim_{n\to\infty}\mu_{n,n}$, and we have:$$\mu_{n,k}=1+\frac{1}{n}\sum_{i=1}^{k-1}\mu_{n,i}$$
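The recurrence is easy to evaluate exactly with rational arithmetic; a small Python sketch (the helper name `mu` is mine):

```python
from fractions import Fraction

def mu(n):
    """Exact values mu[k-1] = mu_{n,k}, the expected rolls to reach a sum >= k
    with a fair n-sided die, via mu_{n,k} = 1 + (1/n) * sum_{i=1}^{k-1} mu_{n,i}."""
    mus = []
    running = Fraction(0)          # holds sum_{i=1}^{k-1} mu_{n,i}
    for k in range(1, n + 1):
        m = 1 + running / n
        mus.append(m)
        running += m
    return mus

print(mu(6)[-1])   # mu_{6,6} = (7/6)^5 = 16807/7776
```

The printed value matches the closed form $(1+\frac{1}{n})^{n-1}$ derived below for $n=6$.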


This equation appears if you work out the following:

If $X_{n,k}$ denotes the number of rolls needed to get a sum $\geq k$ and $D$ denotes the outcome of the first toss then:

$$\mu_{n,k}=\mathbb{E}X_{n,k}=\sum_{i=1}^{n}\mathbb{E}\left(X_{n,k}\mid D=i\right)\Pr\left(D=i\right)$$

Note here that $\mathbb{E}\left(X_{n,k}\mid D=i\right)=1+\mu_{n,k-i}$ if $i<k$ and $\mathbb{E}\left(X_{n,k}\mid D=i\right)=1$ otherwise.


This leads to:$$\mu_{n,k+1}=1+\frac{1}{n}\sum_{i=1}^{k}\mu_{n,i}=1+\frac{1}{n}\left(\mu_{n,k}+n\left(\mu_{n,k}-1\right)\right)=\left(1+\frac{1}{n}\right)\mu_{n,k}$$

And finally to:$$\mu_{n,n}=\left(1+\frac{1}{n}\right)^{n-1}$$

Then evidently: $$\lim_{n\to\infty}\mu_{n,n}=e$$
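Numerically the closed form does tend to $e$; a quick check assuming nothing beyond the formula above:

```python
import math

# (1 + 1/n)^(n-1) for increasing n, compared against e.
for n in (10, 100, 10_000, 1_000_000):
    print(n, (1 + 1/n) ** (n - 1))
print("e =", math.e)
```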

ANOTHER ANSWER

The cumulative sum behaves like a random walk with steps of size $D$. (This is a simplification; a full treatment might need Wald's identity.) Perhaps more simply, we want the expected number of throws before the sum first reaches $N$.

$\mathbb{P}\left(\sum_{i=1}^{k}D_i < N \right)$, the probability that the sum is still below $N$ after $k$ throws, is the quantity we are looking for.

Dividing both sides of the inequality by $N$ gives

$$ \mathbb{P}\left(\sum_{i=1}^{k}D_i < N\right) = \mathbb{P}\left(\sum_{i=1}^{k} \frac{D_i}{N} < 1 \right) $$

Since each $D_i$ is uniformly distributed over $\{1,\dots,N\}$, we need to show that $\frac{D_i}{N}$ converges in distribution to $\text{U}(0,1)$ as $N \to \infty$. That way the problem is easier to deal with.

$$ \mathbb{P}\left(D_i \leq a\right) = \frac{a}{N},\ a \in \{1,…,N\} $$

Dividing by $N$ inside the probability, we get

$$ \mathbb{P}\left(\frac{D_i}{N} \leq \frac{a}{N}\right) = \frac{a}{N},\ a \in \{1,…,N\} $$

Now fix $b \in (0,1]$ and take $a = \lfloor bN \rfloor$, so that $\frac{a}{N} \to b$ as $N \to \infty$. Taking limits, $\mathbb{P}\left(\frac{D_i}{N} \leq b\right) \to b$, which is exactly the CDF of $\text{U}(0,1)$; so $\frac{D_i}{N}$ converges in distribution to a uniform variable. Write $I_i$ for this limiting $\text{U}(0,1)$ variable. Then
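The convergence of the scaled CDF to the uniform CDF $F(b) = b$ can be seen directly; a tiny sketch (the helper name `cdf_scaled` is mine):

```python
import math

def cdf_scaled(N, b):
    """P(D/N <= b) for D uniform on {1,...,N}: equals floor(b*N)/N."""
    return math.floor(b * N) / N

# As N grows, the scaled CDF approaches the U(0,1) CDF, F(b) = b.
for N in (10, 1000, 100_000):
    print(N, cdf_scaled(N, 0.37))
```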

$$ \mathbb{P}\left(\sum_{i=1}^{k} I_i < 1 \right) = \frac{1}{k!} $$
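The $\frac{1}{k!}$ value is easy to sanity-check by simulation (the function name is my own):

```python
import math
import random

def prob_sum_lt_one(k, trials=200_000, seed=1):
    """Monte Carlo estimate of P(U_1 + ... + U_k < 1) for i.i.d. U(0,1) variables."""
    rng = random.Random(seed)
    hits = sum(sum(rng.random() for _ in range(k)) < 1 for _ in range(trials))
    return hits / trials

# Estimates should track 1/k!.
for k in (1, 2, 3, 4):
    print(k, prob_sum_lt_one(k), 1 / math.factorial(k))
```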

since the sum $S_k = \sum_{i=1}^{k} I_i$ is a simple case of the Irwin-Hall distribution, for which $\mathbb{P}(S_k < 1) = \frac{1}{k!}$. If $J$ denotes the number of throws needed for the sum to reach $1$, then $\mathbb{P}(J \geq j) = \mathbb{P}(S_{j-1} < 1) = \frac{1}{(j-1)!}$, so by the tail-sum formula $\mathbb{E}[J] = \sum_{j=1}^\infty \mathbb{P}(J \geq j) = \sum_{j=1}^\infty \frac{1}{(j-1)!} = e$.
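The final series is just the exponential series shifted by one index; a one-liner confirms it:

```python
import math

# sum_{j>=1} 1/(j-1)! = sum_{m>=0} 1/m! = e; truncating at 30 terms is plenty.
total = sum(1 / math.factorial(j - 1) for j in range(1, 30))
print(total, math.e)
```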

Once again, if there are flaws in the argument, I'd very much appreciate it if someone made note of them. This time I was more rigorous.