Markov Chain: pmf at future time steps?

I have the following Markov chain with the state-transition probability matrix: $$W = \begin{bmatrix} 0.7 & 0.3 & 0\\ 0.75 & 0.05 & 0.2\\ 1 & 0 & 0 \end{bmatrix} $$

I know the chain is irreducible.

The problem asks: starting from a discrete uniform distribution, obtain the state pmf at the second and third time steps.

I know the following:

Let the pmf of $\mathbb{X_0}$ be $\lambda_0$ then: $$\lambda_n = \lambda_{n-1}W = \lambda_0W^n$$ where $W^n$ represents an $n$-step transition matrix.

The pmf of a discrete uniform distribution assigns probability $\frac{1}{k}$ to each of the $k$ possible values; here $k = 3$, the number of states.

Question:

So, is $\lambda_0 = \frac{1}{3}$ since there are three states? And do I simply plug $\lambda_0$ into the above equation, compute powers of $W$, and I'm done? Or am I missing something?


Accepted answer:

You are basically correct, except that a discrete uniform distribution means you start in each state with the same probability, so $\lambda_0$ is the row vector $$ \lambda_0 = \frac13 \begin{bmatrix} 1 & 1 & 1 \end{bmatrix} $$ and therefore $$ \lambda_1 = \lambda_0 W = \frac13 \begin{bmatrix} 1 & 1 & 1 \end{bmatrix} \begin{bmatrix} 0.7 & 0.3 & 0\\ 0.75 & 0.05 & 0.2\\ 1 & 0 & 0 \end{bmatrix} $$

Can you finish?
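As a numerical check of the recursion $\lambda_n = \lambda_0 W^n$, here is a short NumPy sketch (the function name `pmf_at` is just for illustration):

```python
import numpy as np

# Transition matrix W from the question
W = np.array([[0.70, 0.30, 0.00],
              [0.75, 0.05, 0.20],
              [1.00, 0.00, 0.00]])

# Uniform initial pmf over the three states: lambda_0 = [1/3, 1/3, 1/3]
lam0 = np.full(3, 1 / 3)

def pmf_at(n, lam0=lam0, W=W):
    """Return lambda_n = lambda_0 @ W^n as a row vector."""
    return lam0 @ np.linalg.matrix_power(W, n)

for n in (1, 2, 3):
    lam = pmf_at(n)
    print(f"lambda_{n} = {np.round(lam, 4)}, sum = {lam.sum():.4f}")
```

Each $\lambda_n$ should sum to $1$, since $W$ is row-stochastic and matrix multiplication by $W$ preserves the total probability.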