Definition of continuous time Markov Process/Chain


I have some doubts about the definition of continuous time Markov chains. The book I'm reading uses the following definition.

A random process $\{X(t) : t \ge 0\}$ is a Markov process if, for all $0 \le s < t$ and all choices of values $x(u)$, $0 \le u \le s$, and $x(t)$, $$P(X(t) \le x(t) \mid X(u) = x(u), 0 \le u \le s) = P(X(t) \le x(t) \mid X(s) = x(s)).$$

A Markov chain is a discrete-valued Markov process.

But for (continuous-time) Markov chains I also have this definition: A discrete-valued random process $\{X(t) : t \ge 0\}$ is a Markov chain if $$t_0 < \dots < t_n \implies P(X(t_n) = x_n \mid X(t_{n-1}) = x_{n-1}, \dots, X(t_0) = x_0) = P(X(t_n) = x_n \mid X(t_{n-1}) = x_{n-1}).$$

Are these two formulations equivalent or is the first more restrictive?

Accepted answer:

There are weird things that can happen for continuous time (so I prefer discrete time).

Here is one weird thing: Let $Z(t)$ be a 2-state homogeneous CTMC with state space $\{0,1\}$ and with transition rates $q_{01}=q_{10}=1$. Let $U$ be a random variable that is uniformly distributed over $[0,1]$ and that is independent of the $Z(t)$ process. Define: $$ X(t) = \left\{\begin{array}{cc} Z(t) & \mbox{if $t \neq U$} \\ Z(100) & \mbox{if $t = U$} \end{array}\right.$$

Then $X(t)$ and $Z(t)$ have the same joint distribution when sampled at any deterministic times $0 \le t_0 < t_1 < \dots < t_n$ (for any positive integer $n$). This is because, with probability 1, these times miss the crazy time $U$. So $X(t)$ satisfies your second definition.
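As a sanity check, the counterexample can be simulated. This is a minimal sketch (function names are my own): the 2-state CTMC with $q_{01} = q_{10} = 1$ is generated via i.i.d. Exp(1) holding times, $U$ is drawn independently, and $X$ is defined exactly as above. Deterministic sample times miss $U$ almost surely, so $X$ and $Z$ agree at all of them, while the full path of $X$ over $[0,1]$ leaks the future value $Z(100)$.

```python
import random

def simulate_ctmc_path(rate=1.0, horizon=101.0, rng=None):
    """Simulate the 2-state CTMC Z with q01 = q10 = rate, started in state 0.
    Returns the jump skeleton as a list of (jump_time, new_state)."""
    rng = rng or random.Random()
    t, state = 0.0, 0
    jumps = [(0.0, 0)]
    while t < horizon:
        t += rng.expovariate(rate)  # Exp(rate) holding time in each state
        state = 1 - state           # the chain alternates between 0 and 1
        jumps.append((t, state))
    return jumps

def evaluate(jumps, t):
    """Value of the piecewise-constant path at time t."""
    state = jumps[0][1]
    for jump_time, s in jumps:
        if jump_time <= t:
            state = s
        else:
            break
    return state

rng = random.Random(0)
jumps = simulate_ctmc_path(rng=rng)
U = rng.random()  # uniform on [0, 1], independent of the Z path

def X(t):
    # The modified process: agrees with Z except at the random time U,
    # where it reveals the future value Z(100).
    return evaluate(jumps, 100.0) if t == U else evaluate(jumps, t)

# Deterministic sample times miss U with probability 1,
# so X and Z have identical values at all of them...
sample_times = [0.25, 0.5, 2.0, 7.5, 50.0]
assert all(X(t) == evaluate(jumps, t) for t in sample_times)
# ...but the path of X over [0, 1] contains the point (U, Z(100)):
assert X(U) == evaluate(jumps, 100.0)
```

The assertions illustrate both halves of the argument: finite-dimensional samples cannot distinguish $X$ from $Z$, yet the full path over any interval containing $U$ carries information about the state at time 100.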

However, $X(t)$ does not satisfy your first definition: knowing the sample path of $X(t)$ over the interval $t \in [0, 5]$ includes knowledge of $X(U) = Z(100)$, which reveals the value of the future state $X(100)$. But if we only know $X(5)$, then we have no such knowledge of the future state $X(100)$.