Probabilities in a Markov chain


I have a problem calculating a probability for a Markov chain with 3 states $S = \{0,1,2\}$.

I need to calculate $P(X_1=1,X_2=1|X_0=2)$.

The answer key in my workbook gives the solution:

$P(X_1=1,X_2=1|X_0=2) = P(X_2=1|X_1=1,X_0=2)P(X_1=1|X_0=2)P(X_2=1|X_1=1)P(X_1=1|X_0=2),$

but I have trouble understanding what happened in this step (I guess the law of total probability is used, but I don't really see how).

A transition matrix and an initial distribution are also given, but I'm not sure whether they are needed here.

Thanks a lot for your help.

Best answer:

There should be another equals sign in the workbook's solution (the two factors are repeated instead of being joined by "="); it should read:

$\begin{align}P(X_1=1,X_2=1|X_0=2) &= P(X_2=1|X_1=1,X_0=2)P(X_1=1|X_0=2)\\ &=P(X_2=1|X_1=1)P(X_1=1|X_0=2) \end{align}$

The first equality uses the definition of conditional probability (the chain rule), applied while keeping the conditioning on $X_0=2$ throughout; compare it with the unconditional version $P(X_2,X_1)=P(X_2|X_1)P(X_1)$.

(Alternatively, multiply both sides of $P(X_1=1,X_2=1|X_0=2) = P(X_2=1|X_1=1,X_0=2)P(X_1=1|X_0=2)$ by $P(X_0=2)$ to check that the equality is true.)

The second equality uses the Markov property: conditional on $X_1$, the state $X_2$ is independent of $X_0$, so $P(X_2=1|X_1=1,X_0=2)=P(X_2=1|X_1=1)$.
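To see the two factors numerically, here is a small sketch. The actual transition matrix and initial distribution from the workbook are not shown in the question, so the matrix below is made up purely for illustration:

```python
import numpy as np

# Hypothetical 3-state transition matrix P over S = {0, 1, 2};
# P[i, j] = P(X_{n+1} = j | X_n = i), each row sums to 1.
# (Not the workbook's matrix -- just illustrative numbers.)
P = np.array([
    [0.2, 0.5, 0.3],
    [0.4, 0.4, 0.2],
    [0.1, 0.6, 0.3],
])

# By the chain rule plus the Markov property:
#   P(X1=1, X2=1 | X0=2) = P(X1=1 | X0=2) * P(X2=1 | X1=1)
#                        = P[2, 1]        * P[1, 1]
prob = P[2, 1] * P[1, 1]
print(prob)  # 0.6 * 0.4
```

Note that the initial distribution is not needed: the probability is conditioned on $X_0=2$, so only the transition matrix enters.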