Markov Chain Transition Probability


When dealing with Markov chains, say I am in state 0 on day 1. Is the probability that I will be in state 0 on day 4 equal to the probability that I will be in state 0 on each of days 2, 3 and 4? I believe these are equivalent, because to get from day 1 to day 4 you have to transition from day 1 to day 2, day 2 to day 3, and day 3 to day 4.



It is not necessary to be in state 0 on days 2 and 3. For example, the path could be $0 \rightarrow 1 \rightarrow 1 \rightarrow 0$ or $0 \rightarrow 1 \rightarrow 2 \rightarrow 0$, and so on.


No, they are not equivalent. To be in state 0 on day 4, you may be in state 1 on day 2, state 2 on day 3, and then jump back to 0 on day 4. Any path that ends at state 0 on day 4 is good enough.

Being in that state all the time (not going anywhere) is one of the possibilities, but not necessarily the exclusive one.

If it is not possible to jump away from state 0, then the two are equivalent and each happens with probability 1 (given that you are in state 0 on day 1 already).
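To see the difference numerically, here is a small sketch with a hypothetical two-state transition matrix (the matrix `P` is an assumption chosen only for illustration, not from the question). It compares the $(0,0)$ entry of $P^3$, the probability of being back in state 0 after three transitions by any path, with $P_{00}^3$, the probability of the single path that stays in state 0 the whole time:

```python
import numpy as np

# Hypothetical 2-state transition matrix (each row sums to 1).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Probability of being in state 0 on day 4, given state 0 on day 1:
# three transitions, so the (0, 0) entry of P^3 (sums over ALL paths).
p_day4 = np.linalg.matrix_power(P, 3)[0, 0]

# Probability of being in state 0 on days 2, 3 AND 4:
# only the one path 0 -> 0 -> 0 -> 0 counts.
p_stay = P[0, 0] ** 3

print(p_day4, p_stay)  # the first is strictly larger here
```

With these numbers, $P^3$ gives $0.583$ while $0.7^3 = 0.343$, so the "any path" probability exceeds the "stay put" probability, exactly as the answers above explain. The two coincide only when $P_{00} = 1$, i.e. state 0 is absorbing.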