Markov Process In Discrete Time


I'm having trouble understanding a simple discrete-time Markov chain. The state space is $S=\{1,2,3\}$, with transition matrix

$$P = \begin{pmatrix} 1/3 & 1/3 & 1/3 \\ 0 & 2/3 & 1/3 \\ 2/3 & 1/3 & 0 \end{pmatrix}.$$

The initial distribution given is

$$\textbf{P}(X_0 = 1) = \textbf{P}(X_0 = 2) = \textbf{P}(X_0 = 3) = 1/3.$$

How would I calculate the probability $\textbf{P}(X_3 = 3)$ or $\textbf{P}(X_3 \in \{2,3\})?$

From notes I know that

$$P^{n}_{i,j} = \textbf{P}(X_n = j \mid X_0=i). $$

So if I want to find $\textbf{P}(X_3 \in \{2,3\})$, I would want to calculate $P^3$, which I find to be

$$P^3 = \begin{pmatrix} 7/27 & 13/27 & 7/27 \\ 2/9 & 14/27 & 7/27 \\ 8/27 & 13/27 & 2/9 \end{pmatrix}.$$

$$\textbf{P}(X_3=2 \mid X_0=1) = P^3_{1,2} = 13/27$$
$$\textbf{P}(X_3=2 \mid X_0=2) = P^3_{2,2} = 14/27$$
$$\textbf{P}(X_3=2 \mid X_0=3) = P^3_{3,2} = 13/27$$
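As a sanity check, this matrix power can be verified with exact rational arithmetic. The sketch below uses Python's standard-library `Fraction`; the `matmul` helper is just a local function defined here, not a library call:

```python
from fractions import Fraction as F

# Transition matrix P with exact rational entries.
P = [[F(1, 3), F(1, 3), F(1, 3)],
     [F(0),    F(2, 3), F(1, 3)],
     [F(2, 3), F(1, 3), F(0)]]

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P3 = matmul(matmul(P, P), P)  # three-step transition matrix P^3
print(P3)
# Middle column gives P(X_3 = 2 | X_0 = i): 13/27, 14/27, 13/27
```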

I'm not sure this is correct, since I haven't taken the distribution of $X_0$ into account. Could someone point out what I'm doing wrong?


Best answer:

You don't need to take the distribution of $X_0$ into account at this stage, because the entries of $P^3$ are conditional probabilities: each one answers "what is the probability that $X_3=2$ given that $X_0$ takes a specific value?"

You then bring in the distribution of $X_0$ via the law of total probability:

$$ \textbf{P}(X_3=2) = \sum_{i=1}^3 \textbf{P}(X_3=2 \mid X_0=i) \cdot \textbf{P}(X_0=i) = \frac{1}{3}\left(\frac{13}{27}+\frac{14}{27}+\frac{13}{27}\right) = \frac{40}{81}.$$
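The whole calculation can be sketched end to end in Python with exact rationals (a minimal sketch; names like `mu0` are just local variables chosen here for the initial distribution):

```python
from fractions import Fraction as F

# Transition matrix and uniform initial distribution from the question.
P = [[F(1, 3), F(1, 3), F(1, 3)],
     [F(0),    F(2, 3), F(1, 3)],
     [F(2, 3), F(1, 3), F(0)]]
mu0 = [F(1, 3), F(1, 3), F(1, 3)]  # P(X_0 = 1) = P(X_0 = 2) = P(X_0 = 3) = 1/3

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P3 = matmul(matmul(P, P), P)  # three-step transition matrix P^3

# Law of total probability: P(X_3 = j) = sum_i P(X_0 = i) * (P^3)_{i,j}
p2 = sum(mu0[i] * P3[i][1] for i in range(3))  # P(X_3 = 2) = 40/81
p3 = sum(mu0[i] * P3[i][2] for i in range(3))  # P(X_3 = 3) = 20/81
print(p2, p3, p2 + p3)  # P(X_3 in {2,3}) = p2 + p3 = 20/27
```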