Problem understanding the definition (Markov Chain)


I am reading *Quasi-Stationary Distributions: Markov Chains, Diffusions, and Dynamical Systems* by Pierre Collet, Jaime San Martín, and Servet Martínez, and at the beginning of Chapter $2$ the authors write the following paragraph:

[image of the quoted paragraph from the book, not included]

where $\mathcal{B}(\mathcal{X})$ is the Borel $\sigma$-algebra of $\mathcal{X}$.

My question: If $(\mathbb{P}_x)_{x \in \mathcal{X}}$ is a family of probability measures on $(\Omega,\mathcal{F})$, and $\Omega$ is the set of right-continuous trajectories, how does the second bullet make sense? There $A\in \mathcal{B}(\mathcal{X})$, and, as far as I know, $A \not\in\mathcal{F}$.

N.B.: I know the usual definition of a Markov chain, but I need to understand this book's definition so that I can make progress in reading it.

Best answer:

For each $x \in \mathcal X $, $P_x$ is a measure on $(\Omega,\mathcal F)$. For fixed $A \in \mathcal F$, $P_x(A)$ is a well defined number. Define $f:\mathcal X \to \mathcal X$ by $f(x)=P_x(A)$. The second bullet says that this function is mesurable in the sense that for each $A \in \mathcal F$, $\{x:P_x(A) \in B)\} \in \mathcal B(\mathcal X)$ for every Borel set $B$ in $\mathbb R$.