Using transition matrix and initial distribution to calculate probabilities. (Markov chains)


Let a Markov chain with state space $\{1,2,3,4\}$ and transition matrix

$$P=\begin{pmatrix}1/3 & 1/3 & 0 & 1/3 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 0 & 0 & 1/2 & 1/2 \\ 0 & 0 & 0 & 1\end{pmatrix}$$

and initial distribution $\lambda=(1/2,1/2,0,0)$ be given.

How can I use this to calculate

$\mathbb P[X_0=2, X_1=1, X_2=2, X_3=1]$ and

$\mathbb P[X_0=2, X_2=2, X_3=1]$?

I tried the following:

$P(X_0=i_0,X_1=i_1,...,X_n=i_n)=P(X_0=i_0)P(X_1=i_1|X_0=i_0)\dots P(X_n=i_n|X_{n-1}=i_{n-1})$

So $P[X_0=2, X_1=1, X_2=2, X_3=1]=P(X_0=2)P(X_1=1|X_0=2)P(X_2=2|X_1=1)P(X_3=1|X_2=2)=\lambda_2 P_{21} P_{12}P_{21}=\frac12\times\frac14\times\frac13\times\frac14=\frac{1}{96}$
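The chain-rule product above can be checked numerically. A minimal sketch using exact rational arithmetic (states $1,\dots,4$ stored at indices $0,\dots,3$):

```python
from fractions import Fraction as F

# Transition matrix P and initial distribution lambda from the question,
# with state k stored at index k-1.
P = [[F(1, 3), F(1, 3), F(0),    F(1, 3)],
     [F(1, 4), F(1, 4), F(1, 4), F(1, 4)],
     [F(0),    F(0),    F(1, 2), F(1, 2)],
     [F(0),    F(0),    F(0),    F(1)]]
lam = [F(1, 2), F(1, 2), F(0), F(0)]

# P[X_0=2, X_1=1, X_2=2, X_3=1] = lambda_2 * P_{21} * P_{12} * P_{21}
p = lam[1] * P[1][0] * P[0][1] * P[1][0]
print(p)  # 1/96
```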


$P[X_0=2, X_2=2] = \sum_{i=1}^{4}P[X_2=2 \mid X_1=i, X_0=2]\cdot P[X_1=i, X_0=2]$

$= \left( \sum_{i=1}^{4}P[X_2=2 \mid X_1=i, X_0=2]\cdot P[X_1=i \mid X_0=2] \right) \cdot P[X_0=2]$

and, since by the Markov property $X_2$ only depends on the value of $X_1$,

$= \left( \sum_{i=1}^{4}P[X_2=2 \mid X_1=i]\cdot P[X_1=i \mid X_0=2]\right) \cdot P[X_0=2]$

so

$P[X_0=2, X_2=2, X_3 =1] = $

$P[X_3=1 \mid X_2=2] \cdot \left(\sum_{i=1}^{4}P[X_2=2 \mid X_1=i]\cdot P[X_1=i \mid X_0=2]\right) \cdot P[X_0=2]$

and the sum is just the product of two vectors: the second row of $P$ with its second column, i.e. the $(2,2)$ entry of $P^2$. Here $(P^2)_{22}=\frac14\cdot\frac13+\frac14\cdot\frac14=\frac{7}{48}$, so the probability is $\frac14\cdot\frac{7}{48}\cdot\frac12=\frac{7}{384}$.
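The factorization above can be verified with the same exact-arithmetic sketch as before (states $1,\dots,4$ at indices $0,\dots,3$):

```python
from fractions import Fraction as F

# Transition matrix P and initial distribution lambda from the question,
# with state k stored at index k-1.
P = [[F(1, 3), F(1, 3), F(0),    F(1, 3)],
     [F(1, 4), F(1, 4), F(1, 4), F(1, 4)],
     [F(0),    F(0),    F(1, 2), F(1, 2)],
     [F(0),    F(0),    F(0),    F(1)]]
lam = [F(1, 2), F(1, 2), F(0), F(0)]

# The inner sum sum_i P_{2i} P_{i2} is the (2,2) entry of P^2.
p2_22 = sum(P[1][i] * P[i][1] for i in range(4))

# P[X_0=2, X_2=2, X_3=1] = lambda_2 * (P^2)_{22} * P_{21}
prob = lam[1] * p2_22 * P[1][0]
print(p2_22, prob)  # 7/48 7/384
```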