Expectation of observation of first value of a random variable


Let $X_k$, $k \geq 0$, be iid random variables with pmf

$$p_j = P(X_1 = j), \quad 1 \leq j \leq m; \qquad \sum_{j = 1}^{m} p_j = 1.$$

Let $T = \min\{n > 0 : X_n = X_0\}$ denote the first time we observe the initial value again. Compute $E[T]$.

The only thing I can make out is that each $X_n$ takes one of $m$ values, each with probability $\frac{1}{m}$ in the uniform case. Beyond that, even if I do know what $X_1$ is, with no indication of how the later random variables will behave, how should I go about solving the question?


2 Answers

BEST ANSWER

Conditional expectation is useful here. Here are some steps:

1) Understand that it suffices to compute $E[T|X_0 = i]$.

2) Understand that, conditional on $X_0 = i$, $T$ is the first index $n \geq 1$ with $X_n = i$; by independence, the draws $X_1, X_2, \dots$ behave the same regardless of $X_0$. Show that either:

a) It is a geometric r.v. with parameter $p_i$, OR

b) Use a clever recursion which you may have encountered while solving problems of this type.

I will update my answer if you need more help. But try thinking along these lines first.
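To spell out the recursion of option (b): condition on the first draw $X_1$. With probability $p_i$ we get $X_1 = i$ and $T = 1$; otherwise one step is spent and the situation restarts. Writing $e_i = E[T \mid X_0 = i]$,

$$e_i = p_i \cdot 1 + (1 - p_i)(1 + e_i) \implies p_i e_i = 1 \implies e_i = \frac{1}{p_i},$$

which matches the geometric-distribution answer of option (a).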

Update: \begin{eqnarray} E[T] &=& \sum_{k=1}^m E[T|X_0=k]P(X_0=k) \\ &=& \sum_{k=1}^m \frac{1}{p_k} \cdot p_k \\ &=& \sum_{k=1}^m 1 \\ &=& m \end{eqnarray}
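The conclusion $E[T] = m$, independent of the particular pmf (as long as every $p_j > 0$), is easy to sanity-check by simulation. A minimal sketch in Python; the function name `simulate_T` and the trial count are my own choices, not from the problem:

```python
import random

def simulate_T(p, trials=100_000, rng=None):
    """Estimate E[T] by Monte Carlo: draw X_0 from pmf p, then keep
    drawing X_1, X_2, ... until the initial value reappears, and
    average the hitting times over many trials."""
    rng = rng or random.Random(0)
    values = range(len(p))
    total = 0
    for _ in range(trials):
        x0 = rng.choices(values, weights=p)[0]
        t = 1
        while rng.choices(values, weights=p)[0] != x0:
            t += 1
        total += t
    return total / trials

# E[T] should come out near m regardless of the (positive) pmf:
print(simulate_T([0.5, 0.3, 0.2]))  # ≈ 3
print(simulate_T([0.25] * 4))       # ≈ 4
```

The first call uses a non-uniform pmf on $m = 3$ values, the second a uniform pmf on $m = 4$; both estimates should land near $m$.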


It really helps to know what $X_0$ is. We don't, but that won't stop us: we'll look at all the possible values at once. By the law of total expectation,

$$E(T) = \sum_{j=1}^m E(T|X_0 = j) P(X_0 = j)$$

Now the distribution of $T$ conditional on $X_0 = j$ is something simple.
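To finish the hint: conditional on $X_0 = j$, each subsequent draw independently equals $j$ with probability $p_j$, so $T \mid X_0 = j$ is geometric,

$$P(T = n \mid X_0 = j) = (1 - p_j)^{n-1} p_j, \qquad E(T \mid X_0 = j) = \frac{1}{p_j},$$

and the sum above collapses to $\sum_{j=1}^m \frac{1}{p_j} \, p_j = m$.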