How is time until transition defined in this lecture note of continuous-time Markov chain?


I'm reading the lecture notes *Continuous time Markov chains* by Alejandro Ribeiro:


Continuous time positive variable $t \in \mathbb{R}^{+}$. States $X(t)$ taking values in countable set, e.g., $0,1, \ldots, i, \ldots$. Stochastic process $\{X(t) \mid t\ge 0\}$ on the probability space $(\Omega, \mathcal{G}, \mathbb{P})$ is a continuous time Markov chain (CTMC) if $$ \begin{aligned} & \mathbb{P}[X(t+s)=j | X(s)=i, X(u)=x(u), u<s] \\ =& \mathbb{P}[X(t+s)=j | X(s)=i] \end{aligned} $$

Notation:

  • $X[s: t]$ state values for all times $s \leq u \leq t$, includes borders.

  • $X(s: t)$ values for all times $s<u<t$, borders excluded.

  • $X(s: t]$ values for all times $s<u \leq t$, exclude left, include right.

  • $X[s: t)$ values for all times $s \leq u<t$, include left, exclude right.

Homogeneous CTMC if $\mathbb{P}[X(t+s)=j | X(s)=i]$ constant for all $s$. Still need $\mathbb{P}[X(t+s)=j | X(s)=i]$ for all $t$ and pairs $(i, j)$. We restrict consideration to homogeneous CTMCs.

  • $T_{i}=$ time until transition out of state $i$ into any other state $j$.

  • $T_{i}$ is a random variable (RV) called the transition time; its complementary cdf (survival function) is $$ \mathbb{P}\left[T_{i}>t\right]=\mathbb{P}[X(0: t]=i \mid X(0)=i] $$
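
A standard consequence of this definition, worth spelling out: the Markov property together with homogeneity makes $T_i$ memoryless, which is why it turns out to be exponentially distributed. A sketch (writing $\nu_i$ for the rate out of state $i$, a symbol not used in the note above):

$$
\begin{aligned}
\mathbb{P}[T_i > s+t \mid T_i > s]
  &= \mathbb{P}\big[X(0:s+t] = i \,\big|\, X(0:s] = i,\; X(0) = i\big] \\
  &= \mathbb{P}\big[X(s:s+t] = i \,\big|\, X(s) = i\big] && \text{(Markov property)} \\
  &= \mathbb{P}\big[X(0:t] = i \,\big|\, X(0) = i\big] && \text{(homogeneity)} \\
  &= \mathbb{P}[T_i > t].
\end{aligned}
$$

Hence $\mathbb{P}[T_i > s+t] = \mathbb{P}[T_i > s]\,\mathbb{P}[T_i > t]$, and the only right-continuous solutions of this functional equation are $\mathbb{P}[T_i > t] = e^{-\nu_i t}$ for some rate $\nu_i \ge 0$.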


I would like to ask how $T_i$ is defined. Is it defined on the same probability space $(\Omega, \mathcal{G}, \mathbb{P})$? If so, what is $T_i (\omega)$ for $\omega \in \Omega$?

On BEST ANSWER

Yes, your transition time is defined on the same probability space. If the process starts a.s. in state $i$, i.e. $X(0)=i$, then it is naturally defined as the first exit time $$T_i(\omega):=\inf\{t>0\mid X(t)(\omega)\neq i \},$$ so for each $\omega \in \Omega$, $T_i(\omega)$ is the first time the sample path $t \mapsto X(t)(\omega)$ leaves $i$.
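
To see this definition in action, here is a minimal simulation sketch. It discretizes a hypothetical two-state chain (rates `q01`, `q10` are made-up example values, and the step size `dt` is an approximation parameter, none of which appear in the note): in a step of length `dt` the chain leaves state $i$ with probability $\approx \nu_i\,dt$, and `first_exit_time` returns the discretized $\inf\{t>0 : X(t)\neq i\}$. Its mean should be close to $1/\nu_i$, consistent with $T_i \sim \mathrm{Exp}(\nu_i)$.

```python
import random

# Hypothetical two-state CTMC: leave state 0 at rate q01, state 1 at rate q10.
q01, q10 = 2.0, 1.0
dt = 1e-3  # discretization step for this sketch

def first_exit_time(i, rng):
    """Approximate T_i = inf{t > 0 : X(t) != i} for a path started at i:
    step a discretized chain that leaves state i with prob ~ rate * dt
    per step, and return the first time the state differs from i."""
    rate = q01 if i == 0 else q10
    t = 0.0
    while rng.random() >= rate * dt:  # chain stays in i this step
        t += dt
    return t + dt  # first time at which X(t) != i

rng = random.Random(42)
n = 20_000
mean = sum(first_exit_time(0, rng) for _ in range(n)) / n
print(mean)  # should be near 1/q01 = 0.5
```

As `dt` shrinks, the geometric number of steps times `dt` converges to an exponential with rate $\nu_i$, matching the memorylessness argument above.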