Finding Probabilities with Markov Chains


I am very new to Markov chains, and during my basic practice I've encountered a few problems that I need clarification on.

I am given a time-homogeneous Markov chain $\{X_n\}$ with state space $\{0,1,2,3\}$ and a $4\times4$ transition matrix $P$, and I am told that $X_0 \sim$ Bin$(3,0.5)$.

One of the questions asks for the probability $P(X_0=0 \mid X_1=1, X_2 = 3)$.

In the Markov chains I have come into contact with thus far, the final state has always succeeded the conditioned-on state. That is to say, in $P(X_i=0 \mid X_j=1)$ we had $i \gt j$, so that I interpret it as a later state given an earlier state. So how is it possible in this case to find an earlier state given only later states? This defies my intuition.

In addition, what role exactly does the Binomial distribution play in this? Does this mean I need to multiply the binomial probability into each state, since that is the probability of that state occurring at that step?

Thank you in advance!


We can use Bayes' theorem:

\begin{align} &P(X_0 = 0 \mid X_1 = 1, X_2=3) \\&= \frac{P(X_1=1, X_2=3 \mid X_0=0)\,P(X_0=0)}{P(X_1=1, X_2=3)}\\ &=\frac{P(X_1=1, X_2=3 \mid X_0=0)\,P(X_0=0)}{\sum_{k=0}^{3} P(X_1=1, X_2=3 \mid X_0=k)\,P(X_0=k)}\\ &=\frac{P(X_2=3 \mid X_1=1)\,P(X_1=1 \mid X_0=0)\,P(X_0=0)}{\sum_{k=0}^{3} P(X_2=3 \mid X_1=1)\,P(X_1=1 \mid X_0=k)\,P(X_0=k)}\\ &=\frac{P(X_1=1 \mid X_0=0)\,P(X_0=0)}{\sum_{k=0}^{3} P(X_1=1 \mid X_0=k)\,P(X_0=k)} \end{align}

Note that the denominator must sum over all four possible values of $X_0$, since the Bin$(3,0.5)$ prior puts positive probability on each of $0,1,2,3$. The third line uses the Markov property to factor the joint probability, and in the last line the common factor $P(X_2=3 \mid X_1=1)$ cancels, so the conditioning on $X_2$ drops out entirely.

You are told the transition probabilities, and the distribution of $X_0$ is exactly where the binomial comes in: it is the prior, $P(X_0=k)=\binom{3}{k}(0.5)^3$ for $k=0,1,2,3$. Hence you have enough information to compute the above quantity.
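As a concrete check, here is a minimal sketch of the computation. The actual transition matrix is not given in the question, so the matrix `P` below is a made-up placeholder; only the structure of the calculation matters.

```python
import numpy as np
from math import comb

# Hypothetical 4x4 transition matrix (rows sum to 1) -- the real one
# from the exercise is not stated in the question.
P = np.array([
    [0.1, 0.4, 0.3, 0.2],
    [0.3, 0.2, 0.2, 0.3],
    [0.2, 0.3, 0.1, 0.4],
    [0.4, 0.1, 0.3, 0.2],
])

# Prior on X_0: Bin(3, 0.5), i.e. P(X_0 = k) = C(3, k) * 0.5^3
prior = np.array([comb(3, k) * 0.5**3 for k in range(4)])

# Bayes: P(X_0 = 0 | X_1 = 1, X_2 = 3).  The factor P(X_2 = 3 | X_1 = 1)
# cancels from numerator and denominator, so only one-step transitions
# into state 1 and the prior are needed.
numerator = P[0, 1] * prior[0]
denominator = sum(P[k, 1] * prior[k] for k in range(4))
posterior = numerator / denominator
print(posterior)  # ~0.2 for this placeholder matrix
```

With these placeholder numbers the numerator is $0.4 \cdot 0.125 = 0.05$ and the denominator is $0.25$, giving a posterior of $0.2$; substituting your actual matrix gives the answer to the exercise.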