Definition of a Markov process on the real numbers.


I was going through Grimmett and Stirzaker's probability textbook and stumbled upon this definition of a Markov process on $\mathbb{R}$.

Definition. The continuous-time process $X$, taking values in $\mathbb{R}$, is called a Markov process if the following holds: \begin{multline} \mathbb{P}\bigl( X(t_n) \leq x \mid X(t_1) = x_1, \dots, X(t_{n-1}) = x_{n-1} \bigr) \\ = \mathbb{P}\bigl( X(t_n) \leq x \mid X(t_{n-1}) = x_{n-1} \bigr) \end{multline} for all $x, x_1, x_2, \dots, x_{n-1}$, and all increasing sequences $t_1 < t_2 < \dots < t_n$ of times.

I find this a little odd because the event $X(t_{n-1}) = x_{n-1}$ may have probability zero, in which case we cannot condition on it in the elementary sense. I was expecting a definition like the following instead (so that the conditioning events are Borel sets of positive probability): \begin{multline} \mathbb{P}(X(t_n) \leq x \mid X(t_1) \leq x_1, \dots, X(t_{n-1}) \leq x_{n-1}) \\ = \mathbb{P}(X(t_n) \leq x \mid X(t_{n-1}) \leq x_{n-1}) \end{multline}
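To make the objection concrete, here is a minimal Monte Carlo sketch (my own illustration, not from the book), assuming $X$ is a standard Brownian motion so that $X(t) \sim \mathcal{N}(0, t)$: the point event $\{X(t) = x\}$ is essentially never hit in simulation, while the half-line event $\{X(t) \leq x\}$ has positive probability.

```python
import random

# Illustration: for standard Brownian motion, X(t) ~ N(0, t), so the
# event {X(t) = x} is a null event, while {X(t) <= x} has positive
# probability. We estimate both frequencies by simulation.

random.seed(0)
n_samples = 100_000
t = 1.0
x = 0.5

hits_exact = 0  # count of samples with X(t) exactly equal to x
hits_leq = 0    # count of samples with X(t) <= x

for _ in range(n_samples):
    xt = random.gauss(0.0, t ** 0.5)  # one draw of X(t) for standard BM
    if xt == x:
        hits_exact += 1
    if xt <= x:
        hits_leq += 1

print(hits_exact / n_samples)  # empirical estimate of P(X(t) = x)
print(hits_leq / n_samples)    # empirical estimate of P(X(t) <= x)
```

The first frequency comes out to $0$, matching the fact that $\mathbb{P}(X(t) = x) = 0$, which is exactly why the elementary conditional probability in the book's display is not directly defined; the rigorous route is conditioning on the $\sigma$-algebra generated by $X(t_{n-1})$, or regular conditional distributions.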