Markov Chain reaching an absorbing state after hitting a certain state


I'm stuck on this Markov chain problem. I need to find the probability that, starting from state $A$, the chain reaches the absorbing state $C$ directly from state $B$ (i.e., the final step into $C$ is taken from $B$). The transition matrix is:

           A      B      C
    A    0.79   0.20   0.01
    B    0.70   0.28   0.02
    C    0      0      1

Thank you very much in advance.

Best answer:

Denote by $p_i$ the probability that the chain, started from state $i$, eventually enters $C$ directly from $B$. Then we have the two equations $$p_A=0.2\,p_B+0.79\,p_A\\p_B=0.02+0.7\,p_A+0.28\,p_B$$ Solving these gives $p_B=3/8$ and $p_A=5/14\approx 0.357$, which is what the question asks for.
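The two equations form a $2\times 2$ linear system, which you can solve by hand or numerically. A minimal sketch (the variable names `p_A`, `p_B` are mine, and I solve the system with Cramer's rule to avoid external libraries):

```python
# Rearrange the two equations into the linear system A x = b:
#   p_A = 0.2 p_B + 0.79 p_A         ->   0.21*p_A - 0.20*p_B = 0
#   p_B = 0.02 + 0.7 p_A + 0.28 p_B  ->  -0.70*p_A + 0.72*p_B = 0.02
a11, a12, b1 = 0.21, -0.20, 0.00
a21, a22, b2 = -0.70, 0.72, 0.02

# Cramer's rule for a 2x2 system
det = a11 * a22 - a12 * a21          # = 0.0112
p_A = (b1 * a22 - a12 * b2) / det    # = 0.004 / 0.0112 = 5/14
p_B = (a11 * b2 - b1 * a21) / det    # = 0.0042 / 0.0112 = 3/8

print(p_A, p_B)  # -> 0.357142..., 0.375
```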


How did we get these equations?

The first equation says that, starting at $A$, the only ways to reach the target are to move to $B$ (probability $0.2$) and then reach the target from there, or to stay at $A$ (probability $0.79$) and reach the target from there. The direct $A\to C$ transition (probability $0.01$) contributes nothing, because then the final step into $C$ does not come from $B$.

The second equation says that, starting at $B$, you reach the target either by going straight to $C$ (probability $0.02$; this counts, so you're done), or by moving to $A$ (probability $0.7$) and reaching the target from there, or by staying at $B$ (probability $0.28$) and reaching the target from there.

This is first-step analysis, a standard method for solving problems like this on Markov chains.
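You can also sanity-check the answer by simulation: run the chain from $A$ many times and count the fraction of runs in which the step into $C$ is taken from $B$. A rough sketch (the helper name `absorbed_from_B` is mine; the probabilities match the question's transition matrix):

```python
import random

# Transition probabilities for the two transient states; C is absorbing.
P = {"A": [("A", 0.79), ("B", 0.20), ("C", 0.01)],
     "B": [("A", 0.70), ("B", 0.28), ("C", 0.02)]}

def absorbed_from_B(rng):
    """Run the chain from A until absorption; report whether the
    final step into C was taken from B."""
    state = "A"
    while state != "C":
        prev = state
        r, cum = rng.random(), 0.0
        for nxt, p in P[state]:
            cum += p
            if r < cum:
                state = nxt
                break
    return prev == "B"

rng = random.Random(0)
n = 100_000
freq = sum(absorbed_from_B(rng) for _ in range(n)) / n
print(freq)  # should be close to 5/14 ≈ 0.3571
```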