Two equivalent definitions of recurrent states in a Markov chain


I am learning about the classification of states in Markov chains and am confused by two definitions of recurrent states.

State $i$ is recurrent if 'starting from $i$, and from wherever you can go, there is a way of returning to $i$'.

This one is quite intuitive, and under some circumstances quite convenient.

The other is defined via the first-passage probabilities:

State $i$ is recurrent if '$f_{ii} = \sum_{n=1}^{\infty} f_{ii}^{(n)} = 1$, where $f_{ij}^{(n)}$ denotes the first-passage probability $\mathbb{P}(X_{n} = j,\; X_{m} \neq j \text{ for } m = 1,2,\dots,n-1 \mid X_{0} = i)$'.
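To make the first-passage quantities concrete, here is a minimal sketch that computes $f_{ii}^{(n)}$ for a small chain by tracking the distribution of paths that avoid $i$, then sums them. The transition matrix `P` is my own made-up example, not from the post.

```python
# Hypothetical 3-state chain (this transition matrix is my own example, not from the post).
P = [[0.2, 0.5, 0.3],
     [0.4, 0.1, 0.5],
     [0.6, 0.2, 0.2]]

def first_return_probs(P, i, nmax):
    """f_ii^(n) for n = 1..nmax: probability the first return to i happens at step n."""
    others = [j for j in range(len(P)) if j != i]
    f = [P[i][i]]                          # f_ii^(1): return to i in one step
    a = {j: P[i][j] for j in others}       # a[j] = P(X_1 = j, no visit to i yet)
    for n in range(2, nmax + 1):
        # first return at exactly step n: step from an i-avoiding state into i
        f.append(sum(a[k] * P[k][i] for k in others))
        # extend the i-avoiding paths by one more step
        a = {j: sum(a[k] * P[k][j] for k in others) for j in others}
    return f

f = first_return_probs(P, 0, 200)
print(sum(f))   # approaches f_00; it tends to 1 because state 0 is recurrent here
```

Since this chain is finite and irreducible, every state is recurrent, so the partial sums of $f_{00}^{(n)}$ converge to $1$.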

Can anyone show me the equivalence of these two definitions?


$f_{ij}$ is exactly your first definition, quantified mathematically. The key observation is that the events $\{X_{n} = j,\; X_{m} \neq j \text{ for } m = 1,\dots,n-1\}$, $n \geq 1$, are pairwise disjoint (the time of the first visit to $j$ is unique), so the sum of their probabilities equals the probability of their union: \begin{align*} f_{ij} = \sum_{n = 1}^{\infty} f_{ij}^{(n)} &= \sum_{n = 1}^{\infty} \mathbb{P}\left( X_{n} = j, \, \, X_{m} \neq j \, \, \text{for} \, \, m \in \{1,2,\dots,n-1\} \, \mid \, X_{0} = i \right) \\ &= \mathbb{P} \left( \bigcup_{n = 1}^{\infty} \{X_{n} = j, \, \, X_{m} \neq j \, \, \text{for} \, \, m \in \{1,2,\dots,n-1\}\} \, \Big| \, X_{0} = i \right) \\ &= \mathbb{P}\left( X_{n} = j \, \, \text{for some} \, \, n \geq 1 \, \mid \, X_{0} = i \right). \end{align*} Hence, given $i,j$, the Markov chain $\{X_{n}\}_{n \in \mathbb{N}}$ started at $i$ hits $j$ with probability $1$ if and only if $f_{ij} = 1$. Taking $j = i$, recurrence ($f_{ii} = 1$) says the chain started at $i$ returns to $i$ with probability $1$, which is precisely the intuitive statement that from wherever the chain can go, it always finds its way back to $i$.
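The "hits $j$ with probability $1$" reading can also be checked by simulation. Below is a rough Monte Carlo sketch estimating $\mathbb{P}(X_n = j \text{ for some } n \geq 1 \mid X_0 = i)$ within a finite horizon; the chain and all parameters are my own illustrative choices.

```python
import random

# Hypothetical chain (these transition probabilities are my own example, not from the post).
P = {0: [(0, 0.2), (1, 0.5), (2, 0.3)],
     1: [(0, 0.4), (1, 0.1), (2, 0.5)],
     2: [(0, 0.6), (1, 0.2), (2, 0.2)]}

def step(state, rng):
    """Draw the next state according to the transition row P[state]."""
    r = rng.random()
    acc = 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return P[state][-1][0]   # guard against floating-point rounding

def hits(i, j, horizon, rng):
    """True if the chain started at i visits j within `horizon` steps (n >= 1)."""
    s = i
    for _ in range(horizon):
        s = step(s, rng)
        if s == j:
            return True
    return False

rng = random.Random(0)
trials = 20_000
est = sum(hits(0, 0, 100, rng) for _ in range(trials)) / trials
print(est)   # estimate of f_00; close to 1, since this finite irreducible chain is recurrent
```

The horizon truncates the union at $n \leq 100$, which slightly underestimates $f_{00}$; for this rapidly mixing chain the truncation error is negligible.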