CTMC: If the transition probability is >0 at one moment, then it is also >0 from that moment onwards


Assume that $\{X_t:t\geq0\}$ is a continuous-time Markov chain with values on a discrete (countable, but possibly not finite) set $S$.

Notation: for $i,j\in S$ and $t\geq0$,

$p_{ij}(t):=\mathrm{P}\{X_t=j|X_0=i\}$.


I want to prove that, for fixed $i$ and $j$ in $S$, the existence of a $s>0$ so that

$p_{ij}(s)>0$

implies

$\forall t>s, p_{ij}(t)>0$.


I've tried lots of things (within my limited knowledge of the subject) and researched as far as I could, but I can't figure it out, and I'm extremely frustrated because it seems like it should be easy.

I'll greatly appreciate detailed answers, since I'm afraid of not understanding the intermediate steps.



EDIT: As I predicted, it's actually quite simple, but I still want to understand all the details (see comments below).

Best answer:

The easiest way to see this is that in a continuous-time Markov chain, you always have a positive probability of staying put in a state for any amount of time.

(The time it takes to leave a state has the $\operatorname{Exp}(\lambda)$ distribution, where $\lambda$ is the sum of all transition rates out of that state; since $\mathrm{P}\{\text{holding time} > t\} = e^{-\lambda t} > 0$ for every $t \geq 0$, staying put for any length of time has positive probability.)
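To illustrate this numerically, here is a small Monte Carlo sketch. The state and its two exit rates are hypothetical, chosen only for the example; the point is that the holding time is the minimum of independent exponential clocks, which is again exponential with the summed rate, and its tail $e^{-\lambda t}$ is positive for every $t$:

```python
import random

random.seed(0)

# Hypothetical exit rates out of some state: two competing exponential clocks.
rate_a, rate_b = 1.0, 2.0
total = rate_a + rate_b          # total exit rate lambda = 3.0

n = 50_000
# Holding time in the state = min of the two exponential alarm clocks,
# which is Exp(rate_a + rate_b).
samples = [min(random.expovariate(rate_a), random.expovariate(rate_b))
           for _ in range(n)]

mean_hold = sum(samples) / n     # should be close to 1/total = 1/3
# P(stay put longer than t) = e^{-total * t} > 0 for every t, so some
# sampled holding times exceed any fixed t (here t = 1, tail ~ e^{-3}).
frac_beyond_1 = sum(x > 1.0 for x in samples) / n
```

With these rates roughly 5% of the sampled holding times exceed $t = 1$, matching $e^{-3} \approx 0.0498$.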

If we make up the notation $p_j^*(t)$ for the probability of staying put in state $j$ for $t$ units of time, then $p_{ij}(t) \geq p_{ij}(s) \cdot p_j^*(t-s)$: one way to be in $j$ at time $t$ is to go from $i$ to $j$ in the first $s$ units of time and then stay put for the remaining $t-s$, and by the Markov property this particular scenario has probability $p_{ij}(s)\,p_j^*(t-s)$. Both factors are positive ($p_j^*(t-s) = e^{-\lambda_j (t-s)} > 0$, where $\lambda_j$ is the total exit rate out of $j$), so $p_{ij}(t) > 0$.
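As a sanity check, the bound can be verified numerically. The three-state generator $Q$ below is hypothetical, chosen just for illustration; $P(t) = e^{Qt}$ is computed with a plain truncated Taylor series (adequate for a small matrix, not a production method), and we check that $p_{ij}(t) \geq p_{ij}(s)\,e^{-\lambda_j (t-s)} > 0$ for several $t > s$:

```python
import math

def mat_mul(A, B):
    # Plain dense matrix product (fine for a 3x3 example).
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transition_matrix(Q, t, terms=60):
    # P(t) = e^{Qt} via truncated Taylor series: sum_k (Qt)^k / k!.
    n = len(Q)
    Qt = [[q * t for q in row] for row in Q]
    P = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in P]
    for k in range(1, terms):
        term = [[x / k for x in row] for row in mat_mul(term, Qt)]
        P = [[P[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return P

# Hypothetical generator: rows sum to 0, off-diagonal rates >= 0.
Q = [[-2.0,  1.0,  1.0],
     [ 1.0, -3.0,  2.0],
     [ 0.5,  0.5, -1.0]]
i, j, s = 0, 1, 0.5
lam_j = -Q[j][j]                      # total exit rate out of state j
p_s = transition_matrix(Q, s)[i][j]
assert p_s > 0                        # hypothesis: p_ij(s) > 0
for t in (0.6, 1.0, 2.0):
    p_t = transition_matrix(Q, t)[i][j]
    # p_ij(t) >= p_ij(s) * P(stay in j for t - s) = p_ij(s) * e^{-lam_j (t-s)}
    assert p_t >= p_s * math.exp(-lam_j * (t - s)) > 0
```

The assertions inside the loop are exactly the inequality from the argument above: the exponential factor never vanishes, so positivity at time $s$ propagates to every later time.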