Transition functions induced by Markov processes


Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space and denote by $(X_t,\mathcal{F}_t)_{t\geq 0}$ a continuous-time Markov process with values in $(E,\mathcal{E})$. For $0\le s<t<\infty$, $x\in E$ and $A\in \mathcal{E}$, define $P_{s,t}(x,A):=\mathbb{P}[X_t\in A \mid X_s=x ]$.

Is $P_{s,t}$ a Markov transition function?

To check this, we need to make sure, among other things, that

(i) $P_{s,t}(x,\cdot)$ is a probability measure for all $x\in E$ - this is clear.

(ii) $P_{s,t}(\cdot, A)$ is measurable for all $A\in\mathcal{E}$ - this is not clear to me.
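For intuition, in a finite state space both conditions can be checked concretely, since the kernel is just a row sum of a transition matrix and every function on a finite measurable space is measurable. A minimal sketch (the matrix `P` below is an arbitrary illustrative choice, not part of the question):

```python
import numpy as np

# Hypothetical one-step transition matrix on E = {0, 1, 2}:
# rows index the starting state x, columns the target state.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

def kernel(x, A, steps=1):
    """P_{s,t}(x, A) when t - s corresponds to `steps` time steps:
    sum the x-th row of the `steps`-step transition matrix over A."""
    Pn = np.linalg.matrix_power(P, steps)
    return sum(Pn[x, a] for a in A)

# (i) P_{s,t}(x, .) is a probability measure: total mass 1 for each x.
for x in range(3):
    assert abs(kernel(x, {0, 1, 2}) - 1.0) < 1e-12

# (ii) x -> P_{s,t}(x, A) is a map on a finite E, hence measurable.
print(kernel(0, {1, 2}))  # → 0.5, the mass of moving from 0 into {1, 2}
```

The subtlety in the question only appears for general state spaces, where (ii) is no longer automatic.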

To check (ii), fix $A\in\mathcal{E}$. We need to show that $P_{s,t}(\cdot,A):(E,\mathcal{E})\rightarrow ([0,1],\mathcal{B}([0,1]))$ is measurable, where $\mathcal{B}([0,1])$ is the Borel $\sigma$-algebra. For this, fix $R\in \mathcal{B}([0,1])$. We must show $P_{s,t}(\cdot,A)^{-1}(R)=\{x\in E:P_{s,t}(x,A)\in R\}\in \mathcal{E}$. I see that $P_{s,t}(\cdot,A)^{-1}(R)=\{x\in E: \mathbb{P}(X_t\in A\mid X_s=x)\in R\}=X_s\left(\mathbb{P}(X_t\in A\mid X_s)^{-1}(R)\right)$, i.e. the image of an event under $X_s$. Now, $\mathbb{P}(X_t\in A\mid X_s)^{-1}(R)\in \mathcal{F}$. But is its image under $X_s$ then necessarily in $\mathcal{E}$? In general, the image of a measurable set under a measurable map need not be measurable, right?
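For what it's worth, one standard way around taking images is the factorization (Doob–Dynkin) lemma; a sketch of that route, assuming the conditional probability is defined through the factorization rather than pointwise:

```latex
% Doob--Dynkin: since \mathbb{P}(X_t \in A \mid X_s) is \sigma(X_s)-measurable,
% there exists a measurable map
%   g_A : (E, \mathcal{E}) \to ([0,1], \mathcal{B}([0,1]))
% such that
\mathbb{P}(X_t \in A \mid X_s) = g_A \circ X_s \quad \mathbb{P}\text{-a.s.}
% Defining P_{s,t}(x, A) := g_A(x) then makes (ii) hold by construction,
% avoiding the direct image of a measurable set under X_s altogether.
```

Note that $g_A$ is only determined up to sets of $\mathbb{P}\circ X_s^{-1}$-measure zero, so $P_{s,t}$ is a choice of such a version.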