How to prove a state is recurrent in a Markov Chain?


Examine the nature of the states of the Markov chain having transition probability matrix link

Here is my approach so far. link

What I did was follow the technique mentioned here. link

However, I quickly realised that this would get complicated very fast due to the large number of possible combinations, so there had to be another way. I then learnt that

If i <-> j and i is recurrent, then j is recurrent.

That seemed to make things easier: if I can prove one state is recurrent, then they are all recurrent, since all the states communicate with one another. Except that I am still unsure how to prove that any one of these states is recurrent. How should I do this? If someone could nudge me in the right direction, I would really appreciate it. The questions I have found so far on Stack Exchange have not helped me understand what I have to do.

BEST ANSWER

I think the easiest approach is to split your state space into irreducible (communicating) classes, i.e. group together the states that communicate with one another.
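This decomposition can be checked mechanically. A minimal sketch: since the matrix from the question is in an image that is not available here, `P` below is a hypothetical 5-state matrix chosen only to match the structure described in this answer (classes {1,3,5} and {2,4}, with an escape route 4 -> 1). Two states communicate iff each is reachable from the other through positive-probability steps.

```python
# Compute the communicating classes of a finite chain and flag closed ones.
# P is a HYPOTHETICAL 5-state matrix (the original image link is broken),
# chosen to match the class structure described in this answer.
P = [
    [0.0, 0.0, 0.5, 0.0, 0.5],  # state 1
    [0.0, 0.5, 0.0, 0.5, 0.0],  # state 2
    [0.5, 0.0, 0.0, 0.0, 0.5],  # state 3
    [0.5, 0.5, 0.0, 0.0, 0.0],  # state 4 (can leave to state 1)
    [0.5, 0.0, 0.5, 0.0, 0.0],  # state 5
]
n = len(P)

def reachable(i):
    """States reachable from i (including i) via positive-probability steps."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v in range(n):
            if P[u][v] > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

reach = [reachable(i) for i in range(n)]

# i <-> j iff each is reachable from the other; group states into classes.
classes = []
for i in range(n):
    cls = frozenset(j for j in range(n) if i in reach[j] and j in reach[i])
    if cls not in classes:
        classes.append(cls)

for cls in classes:
    # In a finite chain, a class is recurrent iff it is closed, i.e. no
    # positive-probability transition leaves the class.
    closed = all(P[i][j] == 0 for i in cls for j in range(n) if j not in cls)
    print(sorted(s + 1 for s in cls), "recurrent" if closed else "transient")
```

For this particular matrix the sketch reports {1,3,5} as a closed (recurrent) class and {2,4} as transient; with the real matrix from the question you would simply substitute its entries into `P`.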

I believe that {1,3,5} is one such class, and {2,4} is another irreducible class.

Observe that {2,4} cannot be recurrent: once you leave this class (say, by going from state 4 to state 1), there is no way of getting back to {2,4} again. So this class is transient.

Since the state space is finite, there must be at least one (positive) recurrent class; therefore {1,3,5} must be recurrent.
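This can also be double-checked numerically with the standard criterion that a state i is recurrent iff the sum over n of the n-step return probabilities p_ii^(n) diverges. The sketch below again uses a hypothetical 5-state matrix consistent with the class structure described here (the real matrix is in the unavailable image): the partial sums for state 1 keep growing, while those for state 2 stay bounded.

```python
# Criterion: state i is recurrent iff the sum over n of P^n[i][i] diverges.
# P is a HYPOTHETICAL matrix (the original image link is broken), chosen so
# that {1,3,5} is a closed class and {2,4} leaks out via the 4 -> 1 move.
P = [
    [0.0, 0.0, 0.5, 0.0, 0.5],  # state 1
    [0.0, 0.5, 0.0, 0.5, 0.0],  # state 2
    [0.5, 0.0, 0.0, 0.0, 0.5],  # state 3
    [0.5, 0.5, 0.0, 0.0, 0.0],  # state 4
    [0.5, 0.0, 0.5, 0.0, 0.0],  # state 5
]
n = len(P)

def mat_mul(A, B):
    """Plain matrix product of two n x n lists of lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Pn = [row[:] for row in P]      # Pn holds P^m for m = 1, 2, ...
sum_state1 = sum_state2 = 0.0
for _ in range(200):
    sum_state1 += Pn[0][0]      # m-step return probability to state 1
    sum_state2 += Pn[1][1]      # m-step return probability to state 2
    Pn = mat_mul(Pn, P)

# sum_state1 grows roughly linearly (recurrent); sum_state2 converges (transient)
print(round(sum_state1, 2), round(sum_state2, 2))
```

Of course, for a finite chain the "closed class" argument above already settles the question; the divergence criterion is just a way to see the same conclusion directly from the powers of the matrix.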

As you said, all states in the same class must be of the same type, i.e. recurrent or transient.