Transition matrix for defined Markov chain


Consider a sequence of iid random variables $ \lbrace \xi_n, \; n=0,1,2,\ldots\rbrace $ with probability mass function $P(\xi_n=0)=0.1,\; P(\xi_n=1)=0.5, $ and $P(\xi_n=2)=0.4.$ Define a Markov chain $(X_n)_{n\geq 0}$ on the state space $\mathcal{S}=\lbrace 0,1,2 \rbrace$ by the rule $X_n = \min(X_{n-1},\xi_n).$ Write down the transition matrix for this Markov chain.

I am fairly new to Markov chains, and this is the first time I have seen a chain defined through a rule of this type rather than by an explicit transition matrix. I am unsure how to approach the question and would appreciate any help.
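In case it helps anyone checking their hand computation: since the rule $X_n = \min(X_{n-1},\xi_n)$ gives each entry as $P_{ij} = P(\min(i,\xi)=j)$, the matrix can be tabulated mechanically. A minimal Python sketch (variable names are my own, not from the problem):

```python
# Distribution of xi_n, taken from the problem statement.
p_xi = {0: 0.1, 1: 0.5, 2: 0.4}
states = [0, 1, 2]

# Entry P[i][j] = P(X_n = j | X_{n-1} = i)
#              = sum of p_xi[k] over all k with min(i, k) == j.
P = [[sum(p for k, p in p_xi.items() if min(i, k) == j) for j in states]
     for i in states]

for row in P:
    print(row)  # each row should sum to 1
```

Note in particular that state $0$ is absorbing: once $X_{n-1}=0$, $\min(0,\xi_n)=0$ regardless of $\xi_n$, so the first row of the matrix is $(1,0,0)$.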