Markov chain with known duration


Let's say there are two states ($A$ and $B$), where the probability of going from state $A$ to $B$ in any interval $i$ is $P_{ab}$. State $B$ always lasts for $n$ intervals, then always goes back to state $A$.

I want to know the probability of being in state $B$ at any given interval. Is it correct to calculate that probability like this:

$$P(B) = \frac{P_{ab}\,n}{1 + P_{ab}\,n}\ ?$$

I see Markov chains being applied to problems similar to this, but I just can't seem to find examples of this exact type of problem.

I don't have enough reputation to embed a picture, but the link shows a diagram of the situation I'm describing.

https://i.stack.imgur.com/Eplf6.png

Best answer:

One option is to make a larger Markov chain that embeds the history information in the state space. Rather than having one state corresponding to $B$, you would have $n$ "dummy" states, each recording how many intervals you have been in state $B$. Whenever you are in state $B$, at each interval you transition to the next dummy state, up until the $n$-th one, and then transition back to $A$.

I.e., you would have the following states:

  • State $A$

  • State $B_1$

  • ...

  • State $B_n$

Transition probabilities:

  • State $A$ to State $B_1$: $P_{ab}$ (and $A$ to itself: $1 - P_{ab}$)

  • State $B_j$ to $B_{j+1}$: $1$, for all $j \in \{1, \dots, n-1\}$

  • State $B_n$ to $A$: $1$

This gives you your Markov chain. From here, I'll let you think about how to calculate the probability of being in state $A$ or in any of the dummy $B$ states.
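As a sanity check, here is a minimal sketch (the function name and parameter names are my own, not from the post) that builds the $(n+1)$-state transition matrix described above, computes its stationary distribution, and compares the total probability of the $B$ states against the formula in the question:

```python
import numpy as np

def stationary_prob_B(p_ab, n):
    """Stationary probability of being in any B_j state for the
    expanded chain with states [A, B_1, ..., B_n]."""
    size = n + 1
    P = np.zeros((size, size))
    P[0, 0] = 1 - p_ab       # A stays in A
    P[0, 1] = p_ab           # A -> B_1
    for j in range(1, n):    # B_j -> B_{j+1} with probability 1
        P[j, j + 1] = 1.0
    P[n, 0] = 1.0            # B_n -> A with probability 1

    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    pi /= pi.sum()
    return pi[1:].sum()      # total mass on B_1, ..., B_n

p_ab, n = 0.3, 4
numeric = stationary_prob_B(p_ab, n)
formula = p_ab * n / (1 + p_ab * n)
print(numeric, formula)
```

Running this for a few values of $p_{ab}$ and $n$, the two numbers agree, which supports the formula in the question: in the stationary distribution each $B_j$ carries mass $P_{ab}\,\pi_A$, so the total $B$ mass is $n P_{ab}\,\pi_A$ and normalization gives $P(B) = n P_{ab}/(1 + n P_{ab})$.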