How to find probability transition matrix for continuous time markov chain?


In Grimmett and Stirzaker, page 258 explains how to find the transition probabilities from a given generator matrix:

(a) nothing happens during $(t,t+h)$ with probability $1+g_{ii}h+o(h)$

(b) the chain jumps to state $j\ (\ne i)$ with probability $g_{ij}h+o(h)$

Clearly the above is a function of the time increment $h$. Equation (8), a little further down on page 258, then gives $\lim_{h\to 0}\frac{1}{h}(P_h - I)=G$, which is consistent with (a) and (b) above.
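As a quick numerical sanity check of that limit (the rates $\alpha=2$, $\beta=3$ are arbitrary illustrative values, not from the book), one can compute $P_h = e^{hG}$ for shrinking $h$ and watch $(P_h - I)/h$ approach $G$:

```python
# Numerical check that (P_h - I)/h -> G as h -> 0,
# using P_h = expm(h*G). Rates are arbitrary illustrative values.
import numpy as np
from scipy.linalg import expm

alpha, beta = 2.0, 3.0
G = np.array([[-alpha, alpha],
              [beta, -beta]])

for h in [1e-1, 1e-3, 1e-5]:
    P_h = expm(h * G)                     # transition matrix over interval h
    err = np.abs((P_h - np.eye(2)) / h - G).max()
    print(h, err)                         # error shrinks as h -> 0
```

The error is $O(h)$, matching the $o(h)$ terms in (a) and (b).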

Now look at the example (15) on page 260: The generator matrix is $$ \begin{bmatrix} -\alpha & \alpha \\ \beta & -\beta \end{bmatrix} $$

And, solving for the transition probabilities (as shown on pages 267 and 268) we get the following: $$P_t= \begin{bmatrix} \frac{\beta+\alpha e^{-(\alpha+\beta)t}}{\beta+\alpha} & \frac{\alpha-\alpha e^{-(\alpha+\beta)t}}{\beta+\alpha} \\ \frac{\beta-\beta e^{-(\alpha+\beta)t}}{\beta+\alpha} & \frac{\alpha+\beta e^{-(\alpha+\beta)t}}{\beta+\alpha} \end{bmatrix} $$ Using (a) and (b) on page 258, I would understand the transition probabilities to be:
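(The closed form above can be verified against the matrix exponential $P_t = e^{tG}$; the rates and the time $t$ below are arbitrary illustrative values, not from the book.)

```python
# Check that the closed-form P_t agrees with expm(t*G).
# alpha, beta and t are arbitrary illustrative values.
import numpy as np
from scipy.linalg import expm

alpha, beta = 2.0, 3.0
G = np.array([[-alpha, alpha],
              [beta, -beta]])

def P(t):
    """Closed-form transition matrix from the text."""
    e = np.exp(-(alpha + beta) * t)
    return np.array([[beta + alpha * e, alpha - alpha * e],
                     [beta - beta * e, alpha + beta * e]]) / (alpha + beta)

t = 0.7
print(np.allclose(P(t), expm(t * G)))   # True
```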

$p_{00}=1+g_{ii}h+o(h) = 1-\alpha h+o(h)$

$p_{01}=g_{ij}h+o(h) = \beta h+o(h)$

$p_{11}=1+g_{ii}h+o(h) = 1-\beta h+o(h)$

$p_{10}=g_{ij}h+o(h) = \alpha h+o(h)$
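(To see what the exact solution gives to first order, one can evaluate the off-diagonal entries of the closed-form $P_h$ from page 267 at a small $h$ and divide by $h$; the rates below are arbitrary illustrative values.)

```python
# First-order behaviour of the exact off-diagonal entries of P_h
# for small h, using the closed form from pages 267-268.
# alpha, beta are arbitrary illustrative values.
import numpy as np

alpha, beta = 2.0, 3.0
h = 1e-6
e = np.exp(-(alpha + beta) * h)
p01 = (alpha - alpha * e) / (alpha + beta)   # exact P_h entry (0,1)
p10 = (beta - beta * e) / (alpha + beta)     # exact P_h entry (1,0)

print(p01 / h)   # approximately alpha = 2.0
print(p10 / h)   # approximately beta  = 3.0
```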

Which set of transition probabilities is correct?