Transient/Recurrent Markov chain


I am currently studying the concepts of recurrent and transient states and was wondering about the following: does this classification depend on the initial distribution?

Let me take this example: you can go from 1->2 and from 2->1 with positive probability, and likewise from 3->4 and from 4->3; all other transitions have probability zero. Now, if you start in state 1 you can never reach state 3, so the return time of state 3 is infinite (you will never reach this state at all). Does this mean that states 3 and 4 are transient?
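To make the example concrete, here is a minimal sketch. The transition matrix `P` is a hypothetical choice (deterministic moves, for illustration); any positive probabilities on the same entries would give the same reachability structure. The check confirms that starting from state 1 the chain can only ever visit {1, 2}, and starting from state 3 only {3, 4}:

```python
# Hypothetical transition matrix for the 4-state chain described above
# (rows/columns 0..3 stand for states 1..4). Only 1<->2 and 3<->4 have
# positive transition probability; everything else is 0.
P = [
    [0.0, 1.0, 0.0, 0.0],  # from state 1
    [1.0, 0.0, 0.0, 0.0],  # from state 2
    [0.0, 0.0, 0.0, 1.0],  # from state 3
    [0.0, 0.0, 1.0, 0.0],  # from state 4
]

def reachable(P, start):
    """Set of states reachable from `start` via positive-probability steps."""
    seen, stack = {start}, [start]
    while stack:
        i = stack.pop()
        for j, p in enumerate(P[i]):
            if p > 0 and j not in seen:
                seen.add(j)
                stack.append(j)
    return seen

print(sorted(reachable(P, 0)))  # from state 1: [0, 1] -> states {1, 2}
print(sorted(reachable(P, 2)))  # from state 3: [2, 3] -> states {3, 4}
```

So the chain splits into two closed communicating classes, {1, 2} and {3, 4}, and a start in one class never touches the other.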

Conversely, if I start in state 3, states 1 and 2 would be transient.

An alternative interpretation could be: given that the chain was in state 3 at some point, how likely is it to return there? In that case states 3 and 4 would be recurrent, and so would states 1 and 2. So which is the right interpretation?

Furthermore, I would be interested in understanding the difference in meaning between a null-recurrent and a positive-recurrent state. Both are recurrent, so the Markov chain will return to such a state infinitely many times; what, then, is the difference in their interpretation?
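For reference, the standard way the two notions are separated is by the expected return time. Writing $T_i = \inf\{n \ge 1 : X_n = i\}$ for the first return time to state $i$ (with $X_0 = i$), a recurrent state, i.e. one with $\mathbb{P}(T_i < \infty) = 1$, is classified as:

```latex
\begin{align*}
  \text{positive recurrent:} \quad & \mathbb{E}[T_i \mid X_0 = i] < \infty, \\
  \text{null recurrent:}     \quad & \mathbb{E}[T_i \mid X_0 = i] = \infty.
\end{align*}
```

So in both cases the chain returns with probability one; in the null-recurrent case the returns happen, but the average waiting time between them is infinite (the classic example being the simple symmetric random walk on $\mathbb{Z}$).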