Transition probability matrix of Markov chain


Given that

$g(x)=\begin{cases} 1/3 \quad\text{for } x=0\\ 1/3 \quad \text{for } x=1\\ 1/3 \quad \text{for } x=2\end{cases}$

Explain why independent draws $X_1,X_2,\dots$ from $g(x)$ give rise to a Markov chain. What is the state space and what is the transition probability matrix $P$?


My thoughts:

$P = \begin{pmatrix} 1/3 & 1/3 & 1/3\\ 1/3 & 1/3 & 1/3\\ 1/3 & 1/3 & 1/3\end{pmatrix}$
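This matrix can be checked empirically. The sketch below (mine, not part of the original question) simulates i.i.d. draws from $g$ and tabulates consecutive pairs; every estimated entry should come out near $1/3$.

```python
import random

random.seed(0)

# Simulate i.i.d. draws from g: each of 0, 1, 2 with probability 1/3.
draws = [random.randrange(3) for _ in range(300_000)]

# Count transitions between consecutive draws: counts[i][j] is the
# number of times state j immediately follows state i.
counts = [[0] * 3 for _ in range(3)]
for prev, nxt in zip(draws, draws[1:]):
    counts[prev][nxt] += 1

# Normalize each row to get the empirical transition matrix.
P = [[c / sum(row) for c in row] for row in counts]
for row in P:
    print([round(p, 3) for p in row])  # each entry close to 0.333
```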

Since the draws are independent, I don't see how successive draws depend on each other, and thus how this can be a Markov chain. Can anyone please explain this to me?

Best answer:

Markov chains are stochastic processes $\{X_n\}$ such that $X_n$ depends only on $X_{n-1}$, in the sense that $\mathbb{P}(X_n = a_n \mid X_{n-1} = a_{n-1}, \ldots, X_1 = a_1) = \mathbb{P}(X_n = a_n \mid X_{n-1} = a_{n-1})$. Independence does not contradict this condition; it is a special case of it. Here both sides equal $1/3$ for every choice of states, so the condition holds, although the "dependence" is in this case rather trivial. The state space is $\{0, 1, 2\}$, and the transition probability matrix is exactly the $P$ you wrote: every row is $(1/3,\ 1/3,\ 1/3)$.
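The "trivial dependence" point can also be seen numerically. In the sketch below (my own illustration, with arbitrarily chosen states to condition on), conditioning on a longer history should leave the conditional distribution unchanged for an i.i.d. sequence: both estimates come out near $1/3$.

```python
import random
from collections import defaultdict

random.seed(1)
draws = [random.randrange(3) for _ in range(600_000)]

# Tally (X_{n-1}, X_n) pairs and (X_{n-2}, X_{n-1}, X_n) triples.
one_step = defaultdict(int)
two_step = defaultdict(int)
for a, b, c in zip(draws, draws[1:], draws[2:]):
    one_step[(b, c)] += 1
    two_step[(a, b, c)] += 1

# Estimate P(X_n = 0 | X_{n-1} = 1) ...
p_one = one_step[(1, 0)] / sum(one_step[(1, c)] for c in range(3))
# ... and P(X_n = 0 | X_{n-1} = 1, X_{n-2} = 2).
p_two = two_step[(2, 1, 0)] / sum(two_step[(2, 1, c)] for c in range(3))

print(round(p_one, 3), round(p_two, 3))  # both close to 0.333
```

Because the extra conditioning on $X_{n-2}$ changes nothing, the Markov property holds, just in a degenerate way.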