Continuous time Markov Chain's Natural Filtration


Given a continuous-time Markov chain $\left(X_t \right)_{t\geq 0}$ with finite or countable state space $S$ and transition matrices $P(t)$, what I want to prove is the following:

Let $f:S \to \mathbb{R}$ be such that $\mathbb{E}[|f(X_t)|]<\infty$ for all $t\geq 0$. Then, for $0\leq s\leq t$,

$$ \mathbb{E}[f(X_t)\mid \mathcal F_s] =(P(t-s)f)(X_s)=\sum_{j\in S} f(j)\,p_{X_s j}(t-s),$$

where $(\mathcal F_t)_{t\geq 0}$ is the natural filtration of the process.
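
For intuition, here is a minimal numerical sanity check of this identity, assuming a made-up $3$-state generator $Q$ (so that $P(t)=e^{tQ}$), a made-up $f$, and times $s<t$; none of these choices come from the question. It estimates $\mathbb{E}[f(X_t)\mid X_s=i]$ by Monte Carlo and compares it with $(P(t-s)f)(i)$.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Hypothetical 3-state generator (rows sum to 0) and test function f -- illustrative choices only.
Q = np.array([[-1.0,  0.6,  0.4],
              [ 0.3, -0.8,  0.5],
              [ 0.2,  0.7, -0.9]])
f = np.array([1.0, -2.0, 3.0])
s, t = 0.5, 1.2

def states_at(times, x0=0):
    """States of one simulated trajectory at the increasing list `times`
    (jump-chain construction: exponential holding times, then jump)."""
    out, x, clock, k = [], x0, 0.0, 0
    while k < len(times):
        rate = -Q[x, x]
        hold = rng.exponential(1.0 / rate)               # Exp(rate) holding time in state x
        while k < len(times) and clock + hold > times[k]:
            out.append(x)                                # x is occupied throughout [clock, clock + hold)
            k += 1
        clock += hold
        x = rng.choice(3, p=Q[x].clip(min=0.0) / rate)   # jump to j with prob. Q[x, j] / (-Q[x, x])
    return out

P_ts = expm((t - s) * Q)                                 # P(t - s) = exp((t - s) Q)
pairs = [states_at([s, t]) for _ in range(50_000)]       # samples of (X_s, X_t)
for i in range(3):
    mc = np.mean([f[xt] for xs, xt in pairs if xs == i])       # Monte Carlo E[f(X_t) | X_s = i]
    print(i, round(mc, 3), round((P_ts @ f)[i], 3))            # should agree up to MC error
```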

To prove this, my book uses the following characterization of the events of $\mathcal F_s$:

$$ \text{every } A \in \mathcal F_s \text{ is a union of disjoint events of the form } \left\{X_{s_1}=i_1,\dots,X_{s_n}=i_n \right\}, \quad s_1,\dots,s_n\leq s. $$

From this point the proof is quite straightforward, but I cannot prove that assumption!


There are 2 solutions below.

First solution:

The events in $\mathcal F_s$ are of the form $\left\{\omega\mid (X_{s_i}(\omega))_{i\geqslant 1}\in B\right\}$, where $(s_i)_{i\geqslant 1}$ is a sequence of non-negative real numbers not exceeding $s$ and $B$ is a Borel subset of the space of sequences with values in $S$, endowed with the product topology (assuming we put the discrete topology on $S$). One can check that this collection of sets is a $\sigma$-algebra and that it contains $\sigma(X_u)$ for each $0\leqslant u\leqslant s$.

Second solution:

Let's begin by using the Markov property to rewrite $$p_{ij}(t-s)=\mathbb{P}(X_t=j\mid X_s=i)=\mathbb{P}(X_t=j\mid X_s=i,A),$$ where $A$ is any set of the form $\left\{X_{s_1}=i_1,\dots,X_{s_n}=i_n \right\}$ with $s_k\leq s$ for $k=1,\dots, n$. Multiplying through by $\mathbb{P}(X_s=i,A)$ we get $$p_{ij}(t-s)\,\mathbb{P}(X_s=i,A) =\mathbb{P}(X_t=j, X_s=i,A).$$

We now rewrite this using indicator random variables and expectations: $$\mathbb{E}\left(p_{ij}(t-s){\bf 1}_{(X_s=i,A)}\right) =\mathbb{E}\left({\bf 1}_{(X_t=j, X_s=i,A)}\right),$$ or, since $X_s=i$ on the event $\{X_s=i\}\cap A$, $$\mathbb{E}\left(p_{X_s\,j}(t-s){\bf 1}_{(X_s=i,A)}\right) =\mathbb{E}\left({\bf 1}_{(X_t=j, X_s=i,A)}\right).$$ Summing over $i\in S$ gives $$\mathbb{E}\left(p_{X_s\,j}(t-s){\bf 1}_{A}\right) =\mathbb{E}\left({\bf 1}_{(X_t=j,A)}\right).$$ Multiplying by $f(j)$ and summing over $j\in S$ gives $$\mathbb{E}\left(\sum_{j\in S} f(j)\, p_{X_s\,j}(t-s){\bf 1}_{A}\right) =\mathbb{E}\left(f(X_t){\bf 1}_{A}\right).$$
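
As a quick sanity check of the identity $\mathbb{E}\left(p_{X_s\,j}(t-s){\bf 1}_{A}\right) =\mathbb{E}\left({\bf 1}_{(X_t=j,A)}\right)$, here is a small Monte Carlo sketch with a made-up generator $Q$, made-up times $s_1<s<t$, and the cylinder event $A=\{X_{s_1}=i_1\}$; all of these choices are illustrative assumptions, not part of the answer. It compares the two expectations directly from simulated paths.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)

# Hypothetical generator and times s1 < s < t; A = {X_{s1} = i1} -- illustrative choices only.
Q = np.array([[-1.0,  0.6,  0.4],
              [ 0.3, -0.8,  0.5],
              [ 0.2,  0.7, -0.9]])
s1, s, t = 0.3, 0.7, 1.5
i1, j = 1, 2

def states_at(times, x0=0):
    """States of one simulated trajectory at the increasing list `times`."""
    out, x, clock, k = [], x0, 0.0, 0
    while k < len(times):
        rate = -Q[x, x]
        hold = rng.exponential(1.0 / rate)
        while k < len(times) and clock + hold > times[k]:
            out.append(x)                                # x is occupied throughout [clock, clock + hold)
            k += 1
        clock += hold
        x = rng.choice(3, p=Q[x].clip(min=0.0) / rate)   # jump-chain step
    return out

P_ts = expm((t - s) * Q)
lhs, rhs = [], []
for _ in range(50_000):
    xs1, xs, xt = states_at([s1, s, t])
    lhs.append(P_ts[xs, j] if xs1 == i1 else 0.0)          # p_{X_s j}(t - s) * 1_A
    rhs.append(1.0 if (xt == j and xs1 == i1) else 0.0)    # 1_{(X_t = j, A)}
print(round(np.mean(lhs), 3), round(np.mean(rhs), 3))      # the two means should agree
```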

Since this equation is true for all $A$ belonging to a $\pi$-system that generates ${\cal F}_s$, we conclude that $$\mathbb{E}\left(\sum_{j\in S} f(j)\, p_{X_s\,j}(t-s)\,\Big|\,{\cal F}_s\right) =\mathbb{E}\left(f(X_t)\mid{\cal F}_s\right),$$ and since $\sum_{j\in S} f(j)\, p_{X_s\,j}(t-s)$ is ${\cal F}_s$-measurable, this means $$ \sum_{j\in S} f(j)\, p_{X_s\,j}(t-s) =\mathbb{E}\left(f(X_t)\mid{\cal F}_s\right).$$