I have to show that the following stochastic process is a Markov process


I don't understand how to show that some stochastic processes have the Markov property. For example, if I have the following process:

$$(\Omega, \mathcal{F}, (X_t)_{t \geq 0}, P^y)$$ where $\Omega = \mathbb{R}$, $\mathcal{F} = \mathcal{B}_{\mathbb{R}}$, $P^y = \delta_y$ (the Dirac measure at $y$), $y \in \mathbb{R}$, and, for every $t \geq 0$, $$ X_t(\omega) = \omega + t, \quad \omega \in \Omega, $$

how can I show one of the equivalent forms of the Markov property? For example: $$ \mathbb{P}(X_t \in A \mid \mathcal{F}_s^X) = \mathbb{P}(X_t \in A \mid X_s) $$ for all $0 \leq s < t$ and $A \in \mathcal{B}_{\mathbb{R}}$, or, equivalently, $$ \mathbb{P}(A \cap B \mid X_t) = \mathbb{P}(A \mid X_t)\, \mathbb{P}(B \mid X_t) $$ for all $A \in \mathcal{F}_t^X$ and $B \in \sigma(X_s : s \geq t)$?
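What I have tried so far: since $P^y = \delta_y$, the process is deterministic under $P^y$ ($X_s = y + s$ almost surely), so I would guess the conditional probabilities degenerate to indicators. A sketch of this line of argument, which I am not sure is rigorous, would be:

```latex
% Under P^y, X_t = X_s + (t - s) is a deterministic function of X_s,
% hence X_t is \sigma(X_s)-measurable. For 0 \le s < t and
% A \in \mathcal{B}_{\mathbb{R}}, this would give, P^y-a.s.,
\begin{align*}
P^y\!\left(X_t \in A \mid \mathcal{F}_s^X\right)
  &= \mathbf{1}_A\!\left(X_s + (t - s)\right)
     && \text{($X_t$ is a function of $X_s$)} \\
  &= P^y\!\left(X_t \in A \mid X_s\right),
\end{align*}
% so conditioning on the whole past \mathcal{F}_s^X gives no more
% information than conditioning on X_s alone.
```

Is this the right way to make the argument precise, or do I need to verify the defining property of conditional expectation directly?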

Thank you!