I have an exercise about the states of a Markov chain. The chain is shown below:

The answer says B and F are transient because they can reach the absorbing state. But how can I tell that, for example, state C is *not* transient? The definition in my textbook says: "State i is said to be transient if, after leaving state i, the probability that it is ever in state i again is < 1." So how do I calculate the probability that, after C goes to D or E, the chain ever comes back to state C, and show that this probability equals 1? Thank you
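In case it helps, here is how I tried to check this numerically using first-step analysis: let h[j] be the probability of ever hitting state i starting from j; then the return probability from i is sum_j P[i,j]*h[j], and h is the minimal non-negative solution of h[j] = P[j,i] + sum_{k != i} P[j,k]*h[k], which you get by iterating from h = 0. The transition matrix below is a placeholder I made up with the same rough structure as the diagram (A absorbing, B and F able to fall toward A, and C, D, E forming a closed loop), since I can't type the figure itself:

```python
import numpy as np

# Placeholder 6-state chain A..F; the real probabilities are in the
# figure.  A is absorbing, B and F can drift toward A, and C, D, E
# form a closed class among themselves.
states = ["A", "B", "C", "D", "E", "F"]
P = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0, 0.0],   # A: absorbing
    [0.5, 0.0, 0.5, 0.0, 0.0, 0.0],   # B: may be absorbed at A
    [0.0, 0.0, 0.0, 0.5, 0.5, 0.0],   # C: goes to D or E
    [0.0, 0.0, 1.0, 0.0, 0.0, 0.0],   # D: straight back to C
    [0.0, 0.0, 1.0, 0.0, 0.0, 0.0],   # E: straight back to C
    [0.0, 0.5, 0.0, 0.0, 0.0, 0.5],   # F: may reach B
])

def return_probability(P, i, iters=500):
    """P(ever return to state i | start at i) by first-step analysis.

    h[j] = P(ever hit i | start at j) is the minimal non-negative
    solution of h[j] = P[j,i] + sum_{k != i} P[j,k] * h[k];
    iterating from h = 0 converges to that minimal solution.
    """
    n = P.shape[0]
    others = [j for j in range(n) if j != i]
    Q = P[np.ix_(others, others)]   # transitions that avoid state i
    b = P[others, i]                # one-step jump straight into i
    h = np.zeros(len(others))
    for _ in range(iters):
        h = b + Q @ h
    full = np.ones(n)
    full[others] = h                # h[i] = 1 by definition
    return float(P[i] @ full)       # decompose on the first step out of i

for s, name in enumerate(states):
    print(name, round(return_probability(P, s), 4))
```

With this placeholder matrix it prints 1.0 for A, C, D, and E but 0.0 for B and 0.5 for F, which matches the claim that B and F are the transient states. Is this the right way to reason about it, and is there a way to see it directly from the diagram without solving the equations?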
