Question on Markov chains: expected number of transitions


I am confused by a statement from my probability book that has to do with Markov chains. I hope someone could clarify it, if possible. Consider a Markov chain for which $P_{11}=1$ and $P_{ij}=\frac{1}{i-1}$ for $j=1,\dots,i-1$ and $i>1$. Let $T_{i}$ be the number of transitions needed to go from state $i$ to state $1$. A recursive formula for $E[T_{i}]$ can be obtained by conditioning on the initial transition: $E[T_{i}]=1+\frac{1}{i-1}\sum_{j=1}^{i-1}E[T_j]$. Could anyone tell me how this formula is derived? I am lost. Thank you for your help.

On BEST ANSWER

$T_i$ is the number of steps we take to get from $i$ to $1$ (so $E[T_i]$ is its expectation). If you think about any one of the paths from $i$ to $1$, we can break it into "the first step" and "the rest of the steps".

Clearly the total number of steps in the path is "the first step" plus "the rest of the steps".

The first step in the path takes us to one of the states $j$ with $1 \le j \le i-1$, with probability $P_{ij}$.

So once we've gone that first step, the expected number of steps remaining in the path is simply the expected number of steps from the state we moved to after the first step.

Hence we get the formula $E[T_i] = 1 + E[T_i^*]$, where $T^*_i$ is the number of steps remaining after the first transition, starting from whichever state that transition lands in.

And $$E[T_i^*] = \sum\limits_{j=1}^{i-1}P_{ij}\,E[T_j].$$

This is because we can go to each of the states from $1$ to $i-1$, each with probability $P_{ij}$. Once we're at that state, the expected number of steps to get to $1$ is $E[T_j]$ (by definition).

Hence, as $P_{ij} = \dfrac{1}{i-1}$:

$$E[T_i] = 1 + \dfrac{1}{i-1}\sum\limits_{j=1}^{i-1}E[T_j]$$
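As a sanity check (not part of the original answer), the recursion can be computed directly; unwinding it shows $E[T_i]$ equals the harmonic number $H_{i-1} = 1 + \frac{1}{2} + \cdots + \frac{1}{i-1}$. A minimal sketch using exact rational arithmetic:

```python
from fractions import Fraction

def expected_steps(n):
    """Return [E[T_1], ..., E[T_n]] via the recursion
    E[T_1] = 0,  E[T_i] = 1 + (1/(i-1)) * sum_{j=1}^{i-1} E[T_j]."""
    E = [Fraction(0)]  # E[T_1] = 0: we are already at state 1
    for i in range(2, n + 1):
        E.append(1 + Fraction(1, i - 1) * sum(E))
    return E

def harmonic(k):
    """H_k = 1 + 1/2 + ... + 1/k (H_0 = 0)."""
    return sum(Fraction(1, m) for m in range(1, k + 1))
```

For example, `expected_steps(4)` gives $[0,\, 1,\, \tfrac{3}{2},\, \tfrac{11}{6}]$, matching $H_0, H_1, H_2, H_3$.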

(So, the $1$ is the first step, and each term in the sum is the expected number of steps from the state you end up in after the first step, weighted by the probability of making that first step)
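One can also verify the formula by simulating the chain itself (again, not from the original answer; this sketch just uses Python's standard `random` module):

```python
import random

def simulate_T(i, rng):
    """Run the chain from state i until it hits state 1; return the step count."""
    steps = 0
    while i != 1:
        # P_{ij} = 1/(i-1): the next state is uniform over {1, ..., i-1}
        i = rng.randrange(1, i)
        steps += 1
    return steps

rng = random.Random(0)
n_trials = 100_000
est = sum(simulate_T(5, rng) for _ in range(n_trials)) / n_trials
# The recursion gives E[T_5] = 1 + 1/2 + 1/3 + 1/4 = 25/12 ≈ 2.083,
# and the Monte Carlo estimate should land close to that value.
```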