The expected value of the sum of random variables of random length.


Suppose we first roll an $N$-sided die and let $X$ be the resulting random variable. Next, we roll an $X$-sided die $X$ times. Let $Y_1,\dots,Y_X$ be the random variables for those rolls.

Question: Is there a way to simplify the expected value of the sum $\sum_{i=1}^X Y_i$?

According to Wald's identity, if $X$ is independent of the i.i.d. sequence $Y_1,Y_2,\dots$, then $E\left[\sum_{i=1}^X Y_i\right] = E[X]\cdot E[Y_1]$. However, in the scenario I'm describing there is a dependency between $X$ and $Y_1,\dots,Y_X$.

Best answer:

Presumably, the faces of the $X$-sided die are labeled with the elements of the set $\{1, 2, \ldots, X\}$, each equally likely. Therefore, the average roll of an $X$-sided die is

$$\frac{X+1}{2}.$$

So, if you roll an $X$-sided die $X$ times, you should expect the sum of the $X$ rolls to be

$$X \cdot \frac{X+1}{2}.$$

The simplest resolution is to forgo any elegance or heavy machinery (i.e., theorems) and condition on the value of the first roll.

Since $X$ takes each value in $\{1,2,\ldots,N\}$ with probability $1/N$, the law of total expectation gives

\begin{align*} &\frac{1}{N} \cdot \left[ \, \sum_{X=1}^{N} \left( X \cdot \frac{X + 1}{2} \right) \, \right] \\ &\hspace{1.5em} = \frac{1}{2N} \cdot \left[ \, \left( \sum_{X=1}^{N} X^2 \right) + \left( \sum_{X=1}^{N} X \right) \, \right]. \end{align*}
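As a sanity check (not part of the original answer), here is a minimal Monte Carlo sketch of the two-stage experiment; the function names `simulate` and `exact` are my own, and the exact value is computed directly from the conditional expectation $X(X+1)/2$ above rather than from any library formula.

```python
import random

def simulate(n_sides, trials=200_000, seed=0):
    """Estimate E[sum of Y_i]: roll an n_sides-sided die to get X,
    then roll an X-sided die X times and sum the results."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x = rng.randint(1, n_sides)                      # first roll: X
        total += sum(rng.randint(1, x) for _ in range(x))  # X rolls of an X-sided die
    return total / trials

def exact(n_sides):
    """Condition on X = x (each with probability 1/N): E[sum | X=x] = x(x+1)/2."""
    return sum(x * (x + 1) / 2 for x in range(1, n_sides + 1)) / n_sides

N = 6
print(exact(N))     # exact expectation by conditioning
print(simulate(N))  # Monte Carlo estimate; should be close
```

For $N = 6$ the exact value is $56/6 \approx 9.33$, and the simulated average should agree to within sampling error.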

Note that

$$ \sum_{k=1}^n k^2 = \frac{n^3}{3} + \frac{n^2}{2} + \frac{n}{6} \hspace{1.5em}\text{and}\hspace{1.5em} \sum_{k=1}^n k = \frac{n^2}{2} + \frac{n}{2}.$$
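Substituting these identities with $n = N$ and simplifying (a final step left implicit in the answer) yields a closed form:

$$\frac{1}{2N}\left[\frac{N^3}{3} + \frac{N^2}{2} + \frac{N}{6} + \frac{N^2}{2} + \frac{N}{2}\right] = \frac{1}{2N}\cdot\frac{N(N+1)(N+2)}{3} = \frac{(N+1)(N+2)}{6}.$$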