Expected value of $M$ and $\lim_{n\rightarrow\infty}\frac{1}{n}E[M]$ of an $n$-sided die given $k\in\mathbb{N}$ throws


Let the random variable $M$ be the largest number rolled after $k\in\mathbb{N}$ throws of a fair $n$-sided die. The task is to compute $E[M]$ and $\lim_{n\rightarrow\infty}\frac{1}{n}E[M]$. Since $M$ only takes values in $\mathbb{N}$, we first notice that $M$ can be written as $\sum_{m=1}^{\infty}1_{M\ge m}$. It follows that $$ E[M]=\int_{\Omega}M\,dP=\int_{\Omega}\sum_{m=1}^{\infty}1_{M\ge m}\,dP=\sum_{m=1}^{\infty}P(M\ge m)=\sum_{m=1}^{n}P(M\ge m), $$where $\Omega=\{1,\dots,n\}^k$ is the probability space. We have that$$ P(M\ge m) =1-P(M<m)=1-\left(1-\frac{1}{n}\right)^k $$is the probability of hitting $m$ at least once in $k$ throws. Plugging that into the sum: $\sum_{m=1}^{n}P(M\ge m)=\sum_{m=1}^{n}\left(1-\left(1-\frac{1}{n}\right)^k\right)=n-n\left(1-\frac{1}{n}\right)^k$
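A quick Monte Carlo sanity check of the derivation (the parameters $n=6$, $k=3$ and the trial count are arbitrary illustrative choices):

```python
# Compare a Monte Carlo estimate of E[M] with the closed form
# n - n (1 - 1/n)^k derived above. n, k, trials are arbitrary.
import random

rng = random.Random(0)
n, k, trials = 6, 3, 100_000
emp = sum(max(rng.randint(1, n) for _ in range(k)) for _ in range(trials)) / trials
closed = n - n * (1 - 1 / n) ** k
print(emp, closed)  # the two values disagree noticeably
```

The mismatch between the simulated value and the formula is part of what puzzles me.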

It looks weird to me that we get $\lim_{n\rightarrow\infty}\frac{1}{n}E[M]=0$. There must be some misunderstanding on my part, since I expected a more interesting result from the limit. I hope someone can clarify this for me!


Best answer

Your reasoning is correct, but $P(M<m)=\left(1-\frac{1}{n}\right)^k$ is wrong: notice that this expression does not depend on $m$.

Actually, by independence of the throws $X_1,\dots,X_k$, $$P(M<m)= \prod_{i=1}^{k} P(X_i<m) = \left(\frac{m-1}{n}\right)^k \tag 1$$
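A Monte Carlo check of $(1)$ — the values of $n$, $k$, $m$ and the trial count are arbitrary illustrative choices:

```python
# Estimate P(M < m) by simulation and compare with ((m-1)/n)^k.
import random

def max_roll(n, k, rng):
    # Maximum of k fair rolls of an n-sided die.
    return max(rng.randint(1, n) for _ in range(k))

n, k, m = 6, 3, 4
rng = random.Random(0)
trials = 100_000
empirical = sum(max_roll(n, k, rng) < m for _ in range(trials)) / trials
exact = ((m - 1) / n) ** k  # = (3/6)^3 = 0.125
print(empirical, exact)
```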

$$E[M]=n-\sum_{m=1}^n \left(\frac{m-1}{n}\right)^k = n- \sum_{t=0}^{n-1} (t/n)^k \tag 2 $$
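A cross-check of $(2)$ by exhaustive enumeration, for small illustrative $n$ and $k$ (the helper names are mine):

```python
# Compare the closed form E[M] = n - sum_{t=0}^{n-1} (t/n)^k
# with brute-force enumeration of all n^k equally likely outcomes.
from itertools import product

def e_max_formula(n, k):
    return n - sum((t / n) ** k for t in range(n))

def e_max_bruteforce(n, k):
    outcomes = list(product(range(1, n + 1), repeat=k))
    return sum(max(o) for o in outcomes) / len(outcomes)

n, k = 6, 3
print(e_max_formula(n, k), e_max_bruteforce(n, k))  # the two agree
```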

The last sum is a Riemann sum and tends, as $n \to \infty$, to $\int_0^1 x^k \, dx = \frac{1}{k+1}$.

Hence, for large $n$,

$$E[M] \approx n - \frac{n}{k+1}= n\, \frac{k}{k+1} \tag 3$$

and $$\frac{E[M]}{n} \to \frac{k}{k+1} \tag 4$$
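The convergence in $(4)$ can be observed numerically; $k=3$ and the values of $n$ below are arbitrary choices:

```python
# E[M]/n = 1 - (1/n) * sum_{t=0}^{n-1} (t/n)^k should approach k/(k+1).
def e_max_over_n(n, k):
    return 1 - sum((t / n) ** k for t in range(n)) / n

k = 3
for n in (10, 100, 10_000):
    print(n, e_max_over_n(n, k))  # approaches k/(k+1) = 0.75
```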

You could have guessed that your result $\frac{E[M]}{n} \to 0$ could not be right by considering the case $k=1$: there, clearly $E[M]/n= (n+1)/(2n) \to 1/2$.

The correct result $(4)$ could also be guessed by noting that as $n$ grows, the normalized variable $M/n$ approximates the maximum of $k$ i.i.d. continuous uniform variables on $[0,1]$, and the expected maximum of $k$ such variables is $k/(k+1)$ (ref).
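A quick simulation of that continuous heuristic (the value $k=4$ and the trial count are arbitrary):

```python
# Monte Carlo: the expected maximum of k i.i.d. Uniform(0,1)
# variables should be k/(k+1).
import random

rng = random.Random(1)
k, trials = 4, 200_000
est = sum(max(rng.random() for _ in range(k)) for _ in range(trials)) / trials
print(est)  # should be near k/(k+1) = 0.8
```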

Another answer

Let $M_k$ be the largest number rolled after $k$ rolls of the die; then $\{M_k:k\in\mathbb N\}$ is a Markov chain on $\{1,2,3,4,5,6\}$ with transition probabilities $$ p_{ij} = \begin{cases} i/6,& i=j\\ 1/6,& i<j\\ 0,& \text{otherwise}. \end{cases} $$ Computing $P^k$ where $P$ is the transition matrix yields $$ P^k = \begin{bmatrix} 6^{-k}&3^{-k}-6^{-k}&2^{-k}-3^{-k}&\left(\frac23\right)^{k}-2^{-k}& \left(\frac56\right)^k-\left(\frac23\right)^{k} & 1-\left(\frac56\right)^k\\ 0&3^{-k}&2^{-k}-3^{-k}&\left(\frac23\right)^{k}-2^{-k}& \left(\frac56\right)^k-\left(\frac23\right)^{k} & 1-\left(\frac56\right)^k\\ 0&0&2^{-k}&\left(\frac23\right)^{k}-2^{-k}& \left(\frac56\right)^k-\left(\frac23\right)^{k} & 1-\left(\frac56\right)^k\\ 0&0&0&\left(\frac23\right)^{k}& \left(\frac56\right)^k-\left(\frac23\right)^{k} & 1-\left(\frac56\right)^k\\ 0&0&0&0&\left(\frac56\right)^k & 1-\left(\frac56\right)^k\\ 0&0&0&0&0&1 \end{bmatrix}. $$ Starting the chain in state $1$ (a value that any roll matches or exceeds), the distribution of $M_k$ is the first row of $P^k$. To compute the expectation of $M_k$, we weight each value $j$ by $\mathbb P(M_k=j)$, the $j^{\mathrm{th}}$ entry of that row: \begin{align} \mathbb E[M_k] &= 1\cdot 6^{-k} + 2\left(3^{-k}-6^{-k}\right) + 3\left(2^{-k}-3^{-k}\right) + 4\left(\left(\tfrac23\right)^{k}-2^{-k}\right)\\ &\quad + 5\left(\left(\tfrac56\right)^{k}-\left(\tfrac23\right)^{k}\right) + 6\left(1-\left(\tfrac56\right)^{k}\right)\\ &= 6 - 6^{-k} - 3^{-k} - 2^{-k} - \left(\tfrac23\right)^{k} - \left(\tfrac56\right)^{k}\\ &= 6 - 6^{-k}\left(1+2^{k}+3^{k}+4^{k}+5^{k}\right). \end{align} From this it follows that $\lim_{k\to\infty}\mathbb E[M_k]=6$.
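The Markov-chain route can be cross-checked numerically against the summation formula from the other answer, $\mathbb E[M_k]=6-\sum_{t=0}^{5}(t/6)^k$ (the helper names are mine):

```python
# Build the transition matrix P for the running maximum of a fair
# 6-sided die, raise it to the k-th power, and read E[M_k] off the
# first row; compare with the direct summation formula.
import numpy as np

def e_max_markov(k):
    P = np.zeros((6, 6))
    for i in range(1, 7):
        P[i - 1, i - 1] = i / 6  # stay at the current maximum
        P[i - 1, i:] = 1 / 6     # jump to any larger value
    Pk = np.linalg.matrix_power(P, k)
    values = np.arange(1, 7)
    return float(Pk[0] @ values)  # first row = distribution of M_k

def e_max_sum(k):
    return 6 - sum((t / 6) ** k for t in range(6))

for k in (1, 3, 10):
    print(k, e_max_markov(k), e_max_sum(k))  # the two columns agree
```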

As for $\lim_{n\to\infty}\frac{1}{n}\mathbb E[M_k]$ with $k$ fixed, the same computation on an $n$-sided die gives $\mathbb E[M_k]=n-\sum_{t=0}^{n-1}(t/n)^k$, which grows proportionally to $n$. So the normalized limit is $\frac{k}{k+1}$, not zero.