I have a Markov chain with the transition matrix $$\pmatrix{0 & 0.7 & 0.3 \\ 0.8 & 0 & 0.2 \\ 0.6 & 0.4 & 0}$$ and I would like to generate a random sequence between the three states (such as $1, 2, 1, 3, \dots, n$). How do I get there while making sure the transition probabilities roughly apply for my sample?
Generate random sample with three-state Markov chain
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
Well, suppose you have a sequence of i.i.d. random variables $U_t$, uniform on $[0,1]$.
Then define $$n_1(u) = 2\times 1_{u<0.7} + 3\times 1_{u\ge 0.7},\\ n_2(u) = 1\times 1_{u<0.8} + 3\times 1_{u\ge 0.8},\\ n_3(u) = 1\times 1_{u<0.6} + 2\times 1_{u\ge 0.6},\\ X_{t+1} = n_{X_t}(U_t).$$ Then $X$ is a realisation of your Markov chain, starting from whichever state you choose as $X_1$. Note that each $n_i$ splits $[0,1]$ according to row $i$ of your transition matrix: e.g. from state $1$, the chain moves to state $2$ with probability $0.7$ and to state $3$ with probability $0.3$, matching the first row.
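The construction above is straightforward to simulate. A minimal Python sketch (the names `next_state` and `simulate` are my own, not from the answer): draw one uniform per step, map it through $n_{X_t}$, and then check that the empirical transition frequencies roughly match the matrix.

```python
import random

def next_state(x, u):
    """The functions n_1, n_2, n_3 from the answer: map a uniform
    draw u in [0,1) to the next state, given current state x."""
    if x == 1:
        return 2 if u < 0.7 else 3
    if x == 2:
        return 1 if u < 0.8 else 3
    return 1 if u < 0.6 else 2  # x == 3

def simulate(n, start=1, seed=0):
    """Generate a length-n realisation X_1, ..., X_n of the chain."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n - 1):
        chain.append(next_state(chain[-1], rng.random()))
    return chain

chain = simulate(100_000)

# Empirical check: among transitions out of state 1, the fraction
# going to state 2 should be close to 0.7.
from_1 = [b for a, b in zip(chain, chain[1:]) if a == 1]
print(sum(s == 2 for s in from_1) / len(from_1))
```

With $10^5$ steps the empirical frequencies typically agree with the transition probabilities to within a couple of decimal places.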
Take a partition of the unit interval that mimics your probabilities: e.g. for transition probabilities $0.2, 0.3, 0.5$, take the partition $$ [0,1] = [0, 0.2) \cup [0.2, 0.5) \cup [0.5, 1]. $$ Now pick a uniform random number in the unit interval and see which set contains it; the index of that set gives the next state.
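This partition idea works for any row of the transition matrix. A small Python sketch (the helper name `make_sampler` is my own): the cumulative sums of the probabilities are the partition cut points, and a binary search locates the sub-interval a uniform draw falls into.

```python
import bisect
import itertools
import random

def make_sampler(probs, rng=random.random):
    """Partition [0,1] at the cumulative sums of probs; a uniform
    draw lands in exactly one half-open sub-interval, and the index
    of that sub-interval is the sampled outcome (0-based)."""
    cuts = list(itertools.accumulate(probs))  # e.g. [0.2, 0.5, 1.0]
    last = len(probs) - 1
    # bisect_right finds the sub-interval; min() guards against a
    # draw landing at or above the final cut due to float rounding.
    return lambda: min(bisect.bisect_right(cuts, rng()), last)

# Sample from the partition [0, 0.2) ∪ [0.2, 0.5) ∪ [0.5, 1]:
sample = make_sampler([0.2, 0.3, 0.5], random.Random(1).random)
draws = [sample() for _ in range(50_000)]
print(draws.count(0) / len(draws))  # close to 0.2
```

To drive the Markov chain, build one sampler per row of the transition matrix and, at each step, call the sampler for the current state.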