Showing continuity of Markov processes

Consider the following concepts of continuity of Markov transition functions:

Let $(P_t)$ be a Markov transition function on $(E, \mathcal{E})$ and let $H$ be a set of bounded, real-valued, $\mathcal{E}$-measurable functions. $(P_t)$ is called

strongly continuous on $H$ if for each $f \in H$ we have

$$\lim_{t \downarrow 0} || P_tf-P_0f||_{\infty} = 0$$

weakly continuous on $H$ if for each $f \in H$ and $x \in E$ we have

$$\lim_{t \downarrow 0} (P_tf)(x) = (P_0f)(x).$$
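To make the difference concrete, here is one standard example I have seen (the translation semigroup; I am assuming $P_0$ is the identity, i.e. the transition function is normal). Take $E = \mathbb{R}$ and

$$(P_t f)(x) = f(x+t).$$

For every bounded continuous $f$ and every $x \in \mathbb{R}$,

$$\lim_{t \downarrow 0} (P_t f)(x) = \lim_{t \downarrow 0} f(x+t) = f(x) = (P_0 f)(x),$$

so $(P_t)$ is weakly continuous on $C_b(\mathbb{R})$. On the other hand,

$$\|P_t f - P_0 f\|_\infty = \sup_{x \in \mathbb{R}} |f(x+t) - f(x)|,$$

which tends to $0$ as $t \downarrow 0$ if and only if $f$ is uniformly continuous. For instance, $f(x) = \sin(x^2)$ is bounded and continuous but not uniformly continuous, so $(P_t)$ is weakly but not strongly continuous on $C_b(\mathbb{R})$.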

I'm struggling to show these properties for given transition functions. In particular, strong continuity seems very hard to me, since I can't think of theorems for establishing this kind of uniform convergence in a probabilistic setting. What are common ways to prove these properties? What are some examples of transition functions that do or do not satisfy them?