Solving a Markov Chain


Let the variables $(X_t)$ for $t \in \mathbb{N}$ form a Markov chain, where each variable takes values in $\{1, 2\}$. We are given the initial pmf $$p(X_1=i) = 0.5$$ for $i=1,2$ and the transition probabilities $$p(X_{t+1} = j\mid X_t = i) = p_{i,j}$$ where $p_{i,j}$ is the $(i, j)$-th element of the matrix $$P=\begin{pmatrix} 0.3 & 0.7\\ 0.6 & 0.4 \end{pmatrix}$$ Find $P(X_3 = 2)$ and $P(X_2 = 1\mid X_3 = 2)$.

I'm stuck on how to start this problem, so any hints would be appreciated.

BEST ANSWER

Hints: By the law of total probability: $$P(X_3=2)=\sum_{i,j\in \{1,2\}}P(X_3=2\mid X_1=i,X_2=j)P(X_1=i,X_2=j)$$ But by the Markov property, $X_3$ is conditionally independent of $X_1$ given $X_2$, so the inner term simplifies and $$P(X_3=2)=\sum_{i,j\in \{1,2\}}P(X_3=2\mid X_2=j)P(X_1=i,X_2=j)$$ Now, $P(X_3=2\mid X_2=j)$ is easy to compute (right?), and $$P(X_1=i,X_2=j)=P(X_2=j\mid X_1=i)P(X_1=i)$$ is again easy to compute (right?) for any $i,j$.
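The sum above is exactly one step of matrix-vector multiplication: if $\pi_t$ is the row vector $(P(X_t=1), P(X_t=2))$, then $\pi_{t+1} = \pi_t P$. A quick numerical sketch of that propagation (plain Python, no libraries assumed):

```python
# Propagate the distribution of X_t one step at a time: pi_{t+1} = pi_t @ P.
P = [[0.3, 0.7],
     [0.6, 0.4]]
pi1 = [0.5, 0.5]  # given uniform distribution of X_1

def step(pi, P):
    """One Markov step: new_pi[j] = sum_i pi[i] * P[i][j]."""
    return [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

pi2 = step(pi1, P)  # distribution of X_2
pi3 = step(pi2, P)  # distribution of X_3

print([round(p, 6) for p in pi2])  # distribution of X_2
print([round(p, 6) for p in pi3])  # distribution of X_3; second entry is P(X_3 = 2)
```

The second entry of `pi3` is the answer to the first part.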

For the second part, just use Bayes' rule:

$$P(X_2=1\mid X_3=2)=\frac{P(X_3=2\mid X_2=1)P(X_2=1)}{P(X_3=2)}$$ and of course reuse the first part to avoid recomputing $P(X_3=2)$.
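Plugging the pieces in numerically, the Bayes'-rule step looks like this (a sketch in plain Python; every quantity comes from the given $P$ and the uniform start):

```python
# Bayes' rule: P(X_2=1 | X_3=2) = P(X_3=2 | X_2=1) * P(X_2=1) / P(X_3=2).
P = [[0.3, 0.7],
     [0.6, 0.4]]

# Distribution of X_2 from the uniform initial distribution.
p_x2 = [0.5 * P[0][j] + 0.5 * P[1][j] for j in range(2)]

# P(X_3 = 2), as computed in the first part.
p_x3_is_2 = p_x2[0] * P[0][1] + p_x2[1] * P[1][1]

# P(X_3=2 | X_2=1) is just the transition probability P[0][1].
posterior = P[0][1] * p_x2[0] / p_x3_is_2
print(round(posterior, 4))
```

Note that the numerator $P(X_3=2\mid X_2=1)P(X_2=1)$ is one of the terms already summed when computing $P(X_3=2)$, which is why nothing new needs to be calculated.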