Finite, irreducible Markov chains - Is the mean arrival time at $j$ always finite?


We consider an irreducible Markov chain $(X_0, X_1, \dots)$ with finite state space $S$ and transition probabilities $p_{ij}$. Then, for $j \in S$, we can define the random variable $$ T_j := \min\{ n \in \mathbb{N}_{\ge 1} \mid X_n = j \}. $$

$T_j$ describes the first arrival time at $j$, with the convention $\min \emptyset := \infty$. Now, in my lecture notes, we introduced the mean arrival time $m_{ij}$ for arrival at $j$ when starting at $i \in S$. It is claimed that, under the above assumptions, $m_{ij}$ is ALWAYS FINITE. Since $m_{ij}$ was not formally defined, I interpreted it as the following conditional expectation:

$$ m_{ij}=E\left(T_j \mid X_0=i\right)$$

But now my problem is that I do not understand how to deal with this expectation, because $T_j$ could be infinite (am I right?). It follows from irreducibility that $\sum_{n=1}^{\infty} P(T_j = n \mid X_0 = i) > 0$, i.e. $P(T_j < \infty \mid X_0 = i) > 0$, but I don't see why this should be sufficient for $m_{ij}$ to be finite; in particular, $P(T_j = \infty \mid X_0 = i) > 0$ still seems possible to me. In general, I don't understand how expectations can be defined properly for random variables taking values in $\mathbb{R} \cup \{\infty\}$. Thanks for any advice!


All states of the Markov chain belong to the same communicating class (by irreducibility), and therefore they are either all transient or all recurrent (the main point being that they are all of the same type). Now, since the state space is finite, they must all be recurrent. So the probability of eventually reaching a state $j$ is $1$, not just $>0$. Hence, concerning your first question: no, $T_j = \infty$ is not possible, in the sense that $P(T_j = \infty \mid X_0 = i) = 0$.

By the above argument (all states communicate and are therefore of the same type), they are either all null recurrent or all positive recurrent. But if they were all null recurrent, the expected return time to every state would be infinite, a contradiction in a finite chain (there are more precise proofs of the fact that all recurrent states in a finite Markov chain are necessarily positive recurrent, see also here).

So, to sum up: irreducibility implies that all states are of the same type, because they all communicate, and finiteness rules out transience and null recurrence. Positive recurrence then gives finite mean return times, and together with irreducibility this yields $m_{ij} < \infty$ for all $i, j$.
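As a concrete sanity check, the mean arrival times can be computed by first-step analysis: conditioning on the first step gives $m_{ij} = 1 + \sum_{k \ne j} p_{ik}\, m_{kj}$, a linear system that is solvable precisely because the chain is irreducible. A minimal sketch in Python (the transition matrix `P` below is an arbitrary irreducible example, not taken from the question):

```python
import numpy as np

# An arbitrary irreducible chain on 3 states (each row sums to 1).
P = np.array([
    [0.0, 0.5, 0.5],
    [0.3, 0.0, 0.7],
    [0.6, 0.4, 0.0],
])

def mean_hitting_times(P, j):
    """Mean arrival times m_{ij} at state j via first-step analysis:
    m_{ij} = 1 + sum_{k != j} p_{ik} m_{kj}, solved as (I - Q) m = 1,
    where Q is P with column j set to zero (a step into j ends the wait)."""
    n = P.shape[0]
    Q = P.copy()
    Q[:, j] = 0.0
    return np.linalg.solve(np.eye(n) - Q, np.ones(n))

m = mean_hitting_times(P, 0)
print(m)  # all entries finite; m[0] is the mean return time to state 0
```

For an irreducible chain the matrix $I - Q$ is invertible (Q is strictly substochastic in the relevant sense), so the solution exists and every $m_{ij}$ comes out finite, as the answer above argues.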

Lastly, the expectation of a non-negative random variable (such as $T_j$) is always defined in the usual way, even when the variable takes values in $[0, \infty]$; it may then equal $+\infty$, but here the argument above shows it is finite.
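Concretely, for a variable taking values in $\mathbb{N}_{\ge 1} \cup \{\infty\}$ the expectation can be written out as:

```latex
$$
E\left(T_j \mid X_0 = i\right)
  = \sum_{n=1}^{\infty} n \, P\left(T_j = n \mid X_0 = i\right)
  \;+\; \infty \cdot P\left(T_j = \infty \mid X_0 = i\right),
$$
```

with the convention $\infty \cdot 0 = 0$. Once recurrence gives $P(T_j = \infty \mid X_0 = i) = 0$, the second term vanishes and finiteness of $m_{ij}$ is exactly the convergence of the series, which positive recurrence guarantees.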