This is Question 30 on page 168 of Ross's Introduction to Probability Models (11th edition).
Let $X_i, i\ge 0$ be independent and identically distributed random variables with probability mass function
$p(j)=P(X_i=j),\ j=1,\ldots,m,\qquad \sum_{j=1}^m p(j) = 1$
Find $E[N]$, where $N=\min\{n>0: X_n = X_0\}$
The same question was asked here before, but I don't have enough reputation to comment on it.
Here are my questions:
What is the sample space here? Can anyone give a simple example, or point me to references?
What does $X_n = X_0$ mean for two random variables with the same distribution?
Is it possible to solve it using conditional probability?
Are there any applications of such a question? What is the point behind the question?
Any help would be greatly appreciated.
Update 1: The Solution says:
$$E[N] = \sum_{j=1}^m E[N \mid X_0 = j]\, p(j) = \sum_{j=1}^m \frac{1}{p(j)}\, p(j) = m$$
I am totally confused, so I want to confirm a few things.
- $N=\min\{n>0: X_n = X_0\}$: Is $N$ the smallest index $n>0$ for which $X_n$ takes the same value as $X_0$?
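To get a feel for the result (and for the "simple example" you asked about), here is a quick Monte Carlo sanity check. The pmf $p = (0.5, 0.3, 0.2)$ on $\{1,2,3\}$ is an arbitrary choice of mine, not from the book; any pmf on $\{1,\ldots,m\}$ should give an estimate near $m$.

```python
import random

def sample_N(p, rng):
    """Draw X_0, X_1, ... i.i.d. from pmf p and return N = min{n > 0 : X_n = X_0}."""
    values = range(1, len(p) + 1)
    x0 = rng.choices(values, weights=p)[0]
    n = 1
    while rng.choices(values, weights=p)[0] != x0:
        n += 1
    return n

p = [0.5, 0.3, 0.2]  # arbitrary example pmf on {1, 2, 3}, so m = 3
rng = random.Random(0)
trials = 200_000
estimate = sum(sample_N(p, rng) for _ in range(trials)) / trials
print(estimate)  # should be close to m = 3
```

A single sample path here is one point of the sample space: an infinite sequence $(x_0, x_1, x_2, \ldots)$ of values from $\{1,\ldots,m\}$, and $N$ is a function of that sequence.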
Using nothing but definitions we get
\begin{align} \mathbb E(N) &=\sum_{n\geq 1}\,n\cdot\mathbb P(N=n)\\ &=\sum_{n\geq 1}\,n\cdot\mathbb P(\min\{k\geq 1\,:\,X_k=X_0\}=n)\\ &=\mathbb P\left(X_1=X_0\right)+\sum_{n\geq 2}\,n\cdot \mathbb P\left(\Big(X_n=X_0\Big)\cap\left(\bigcap_{k=1}^{n-1}\Big(X_k\neq X_0\Big)\right)\right) \end{align}
Now, we may use the law of total probability to calculate each of the terms above. We'll have
\begin{align} \mathbb P\left(X_1=X_0\right) =&\sum_{j\geq1}\,\mathbb P\left(X_1=j\,|\,X_0=j\right)\cdot\mathbb P\left(X_0=j\right)\\ \stackrel{\text{indep.}}{=}&\sum_{j\geq1}\,\mathbb P\left(X_1=j\right)\cdot\mathbb P\left(X_0=j\right)\\ \stackrel{\text{iid}}{=}&\sum_{j\geq1}\,{\mathbb P\left(X_0=j\right)}^2 \tag{1} \end{align}
Similarly:
\begin{align} \mathbb P\left(\Big(X_n=X_0\Big)\cap\left(\bigcap_{k=1}^{n-1}\Big(X_k\neq X_0\Big)\right)\right)& \\ =\sum_{j\geq1}\,\mathbb P\left(\Big(X_n=j\Big)\cap\left(\bigcap_{k=1}^{n-1}\Big(X_k\neq j\Big)\right)\,\middle|\,X_0=j\right)\cdot\mathbb P\left(X_0=j\right)&\\ \stackrel{\text{indep. and iid}}{=}\,\sum_{j\geq1}\,{\mathbb P\left(X_0=j\right)}^2\cdot{\mathbb P\left(X_0\neq j\right)}^{n-1}&& \tag{2} \end{align}
So we can see that the formula in $(2)$ also applies to $(1)$, with $n=1$. Hence:
$$\mathbb E(N) =\sum_{n\geq 1}\,n\cdot \left(\sum_{j\geq1}\,{\mathbb P\left(X_0=j\right)}^2\cdot{\mathbb P\left(X_0\neq j\right)}^{n-1}\right) \tag{3}$$
For our particular case, $(3)$ reduces to
$$\mathbb E(N) =\sum_{n\geq1}\,n\cdot \left(\sum_{j=1}^m\,p(j)^2\cdot{\big(1-p(j)\big)}^{n-1}\right) $$
We can then change summation order to obtain
$$\mathbb E(N) =\sum_{j=1}^m\,p(j)^2\cdot \underbrace{\left(\sum_{n\geq1}\,n\cdot{\big(1-p(j)\big)}^{n-1}\right)}_{(*)} $$
To calculate $(*)$, consider the identity
$$\sum_{n\geq 0}x^n=\frac1{1-x},$$
which holds whenever $|x|<1$. Differentiate both sides to obtain
$$\sum_{n\geq 1}nx^{n-1}=\frac1{(1-x)^2}.$$
We can apply the identity above to $(*)$ with $x=1-p(j) \iff 1- x = p(j)$, yielding
$$\mathbb E(N) =\sum_{j=1}^m\,p(j)^2\cdot \frac{1}{p(j)^2} =\sum_{j=1}^m\,1=m $$
as the solution claims.
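As a numerical sanity check of $(3)$, we can truncate the inner series at a large cutoff and confirm that the double sum comes out to $m$. The pmf below is an arbitrary example of mine, not from the problem.

```python
p = [0.5, 0.3, 0.2]  # arbitrary example pmf; m = 3
cutoff = 2000  # the geometric tail beyond this is negligible

# Truncated version of (3): sum over j of p(j)^2 * sum_n n * (1 - p(j))^(n-1).
total = sum(
    pj ** 2 * sum(n * (1 - pj) ** (n - 1) for n in range(1, cutoff + 1))
    for pj in p
)
print(total)  # should approach m = 3 as the cutoff grows
```

Note that the answer $m$ does not depend on the particular pmf, only on the number of support points — which is what the solution's $\sum_j (1/p(j))\,p(j) = m$ computation makes transparent.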