A question on the definition of Markov process


Included in the definition of a Markov process is the following condition:

$$\text{for } x \in \mathbb{R}^d,\; s, t \ge 0,\; \Gamma \in \mathcal{B}(\mathbb{R}^d): \qquad P^x[X_{t+s} \in \Gamma \mid X_s = y] = P^y[X_t \in \Gamma] \quad \text{for } P^x X_s^{-1}\text{-a.e. } y.$$

What is bothering me is this: if $X_s$ has a continuous density, then the conditioning event $\{X_s = y\}$ has probability zero for every $y$, so the conditional probability on the left-hand side does not seem well defined. Yet any reasonable stochastic process must satisfy this condition (among others, which I haven't written here) in order to be Markov. How do I resolve this apparent paradox?



Best answer

By the factorization of conditional expectation, we can write $P^x[X_{t+s}\in \Gamma \mid X_s]$ as $f(X_s)$ for some measurable function $f$, and $f$ is unique up to $P^x X_s^{-1}$-null sets. In the Markov property, the left-hand side $P^x[X_{t+s}\in \Gamma \mid X_s = y]$ is interpreted as $f(y)$, which is exactly why the identity is only asserted for $P^x X_s^{-1}$-a.e. $y$.
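
To spell this out, here is a sketch of the defining relation for $f$, using the notation of the question (the measure $\mu_s$ introduced below is just shorthand for the pushforward $P^x X_s^{-1}$):

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
Fix $x \in \mathbb{R}^d$, $s, t \ge 0$, $\Gamma \in \mathcal{B}(\mathbb{R}^d)$,
and write $\mu_s := P^x X_s^{-1}$ for the law of $X_s$ under $P^x$.
A measurable function $f : \mathbb{R}^d \to [0,1]$ is a \emph{version} of
$P^x[X_{t+s} \in \Gamma \mid X_s = \cdot\,]$ if, for every
$B \in \mathcal{B}(\mathbb{R}^d)$,
\[
  P^x\bigl[X_{t+s} \in \Gamma,\; X_s \in B\bigr]
  = \int_B f(y)\, \mu_s(dy).
\]
Such an $f$ exists by the Radon--Nikodym theorem, and any two versions
agree $\mu_s$-a.e. No single point value $f(y)$ is pinned down when
$\mu_s(\{y\}) = 0$; only the $\mu_s$-equivalence class of $f$ is. The
Markov property therefore asserts
\[
  f(y) = P^y[X_t \in \Gamma] \quad \text{for } \mu_s\text{-a.e. } y,
\]
i.e.\ the specific function $y \mapsto P^y[X_t \in \Gamma]$ is one valid
version of the conditional probability -- which is a meaningful statement
even though $\{X_s = y\}$ is a null event for each fixed $y$.
\end{document}
```

So the "paradox" dissolves: the equation is not a pointwise identity between numbers attached to null events, but an almost-everywhere identity between versions of a conditional probability.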