Simulating a discrete Markov process from a reducible transition rate matrix


I'm trying to model an irreversible, discrete Markov process. I have a set of states $S$ arranged in a tree-like structure (it is only possible to move from a parent vertex to a child vertex). I compute the transition rates of the parent-to-child arcs (each rate is assumed to parametrize an exponential random variable characterizing the time to transition), and I collect these rates in an upper-triangular, weighted Laplacian matrix $L$.

From what I understand, the weighted Laplacian $L$ is the transition rate matrix (infinitesimal generator) describing the rates at which a continuous-time Markov chain transitions between states. Based on this question, it seems that to simulate from the transition rate matrix I need the transition probability matrix (a stochastic matrix).

How might I simulate trajectories from the transition rate matrix given in $L$? Since my process is irreversible, must I take into account the exponential distribution of transition times, or is there an alternative way to "convert" $L$ to a transition probability matrix?


BEST ANSWER

The simplest way is this (it does not require the structure of a tree). Assume: \begin{align} S &= \mbox{State space}\\ (q_{ij}) &= \mbox{transition rates} \\ v_i &= \sum_{j \in S: j \neq i} q_{ij} \end{align}
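For concreteness, the exit rates $v_i$ (and the jump probabilities $q_{ij}/v_i$ used in the next step) can be computed directly from the rate matrix. A minimal NumPy sketch, assuming the off-diagonal entries of the matrix are the nonnegative rates $q_{ij}$; the 4-state tree below is a hypothetical example, not from the question:

```python
import numpy as np

# Hypothetical rate matrix for a 4-state tree:
# root 0 -> children 1, 2; state 1 -> child 3; 2 and 3 are leaves.
Q = np.array([
    [0.0, 2.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 3.0],
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
])

# Total exit rate from each state: v_i = sum_{j != i} q_ij
v = Q.sum(axis=1) - np.diag(Q)

# Jump-chain probabilities P_ij = q_ij / v_i for non-absorbing states;
# absorbing states (v_i = 0, the leaves of the tree) keep all-zero rows.
P = np.zeros_like(Q)
nonabs = v > 0
P[nonabs] = Q[nonabs] / v[nonabs, None]
np.fill_diagonal(P, 0.0)
```

In a reducible chain like this one, the leaf states have $v_i = 0$, so the corresponding rows of $P$ cannot be normalized; those states are absorbing and the simulation simply stops there.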

  • When you enter a new state $i$, independently generate an exponentially distributed random variable $T_i$ with parameter $v_i$. The time $T_i$ is the time you stay in state $i$.

  • After time $T_i$, independently transition to a new state $j \in S$ according to transition probabilities: $$ P_{ij} = \left\{\begin{array}{ll} \frac{q_{ij}}{v_i} \quad & \mbox{ if $j \neq i$}\\ 0 & \mbox{ if $i=j$}\end{array}\right. $$ Notice that these probabilities indeed sum to 1: $\sum_{j\in S}P_{ij} = 1$.
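The two steps above are the standard jump-chain/holding-time (Gillespie-style) construction. A sketch of one full trajectory in Python, assuming a NumPy rate matrix whose off-diagonal entries are the $q_{ij}$; the function name and the example matrix are illustrative, not from the question:

```python
import numpy as np

def simulate_ctmc(Q, start, t_max, rng=None):
    """Simulate one trajectory of a continuous-time Markov chain.

    Q: rate matrix whose off-diagonal entries are q_ij >= 0 (diagonal ignored).
    Returns the list of visited states and the times they were entered.
    Stops at t_max or on reaching an absorbing state (exit rate 0).
    """
    rng = np.random.default_rng() if rng is None else rng
    Qoff = Q.copy()
    np.fill_diagonal(Qoff, 0.0)
    states, times = [start], [0.0]
    i, t = start, 0.0
    while True:
        v_i = Qoff[i].sum()                  # total exit rate from state i
        if v_i == 0:                         # absorbing state: stay forever
            break
        t += rng.exponential(1.0 / v_i)      # holding time T_i ~ Exp(v_i)
        if t > t_max:
            break
        i = rng.choice(len(Q), p=Qoff[i] / v_i)  # jump with prob q_ij / v_i
        states.append(int(i))
        times.append(t)
    return states, times

# Hypothetical 4-state tree: 0 -> {1, 2}, 1 -> 3; 2 and 3 absorbing.
Q = np.array([
    [0.0, 2.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 3.0],
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
])
states, times = simulate_ctmc(Q, start=0, t_max=100.0,
                              rng=np.random.default_rng(0))
```

Because the rate matrix is upper-triangular (parent-to-child moves only), every trajectory visits a strictly increasing sequence of state indices and ends in a leaf; irreversibility requires no special treatment beyond the exponential holding times above.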