Explanation of Markov transition function


Here is the definition from my course (in the picture below). Could someone explain the Chapman-Kolmogorov equation to me? I don't really understand what it means. I also tried to draw a parallel with discrete-time Markov chains, but I don't see the link between the continuous and discrete cases. Is the motivation behind them the same? (In discrete time, $(X_n)$ is a Markov chain if $$\mathbb P\{X_{n+1}=x\mid \sigma (X_{k}\mid k\leq n)\}=\mathbb P\{X_{n+1}=x\mid X_n\}.)$$ Also, $P_{s,t}(x,dy)$ is said to be the regular version of the conditional distribution of $X_t$ given $X_s$... I'm not really sure what that means. Is it $$P_{s,t}(x,dy)=\mathbb P\{X_t\in dy\mid X_s=x\} \ \ ?$$ But I'm not sure that is right.

[Picture: the course's definition of a Markov transition function]

There is 1 answer below.

You can think of a continuous-time Markov process as being fully characterized by an "embedded" discrete-time Markov chain that governs the probabilities of transitions between states (call them $Q_{ij}$), together with holding-time parameters $\lambda_i$ that give the rate at which the process leaves each state $i$. (Those holding times are always exponentially distributed, so knowing $\lambda_i$ is enough to characterize their distribution.)
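This two-ingredient description can be simulated directly: wait an exponential time, then jump according to the embedded chain. Here is a minimal sketch; the 3-state chain, the matrix `Q`, and the rates `lam` are all made up for illustration, not taken from the course.

```python
import numpy as np

# Hypothetical 3-state example: Q is the embedded jump chain (rows sum
# to 1, zero diagonal) and lam[i] is the holding-time rate in state i.
# All numbers here are invented for illustration.
Q = np.array([[0.0, 0.7, 0.3],
              [0.5, 0.0, 0.5],
              [0.2, 0.8, 0.0]])
lam = np.array([1.0, 2.0, 0.5])

def simulate_ctmc(Q, lam, state, t_max, rng):
    """Simulate one path up to time t_max: hold in `state` for an
    Exp(lam[state]) time, then jump according to row `state` of Q."""
    t, path = 0.0, [(0.0, state)]
    while True:
        t += rng.exponential(1.0 / lam[state])         # exponential holding time
        if t >= t_max:
            return path
        state = int(rng.choice(len(lam), p=Q[state]))  # embedded-chain jump
        path.append((t, state))

rng = np.random.default_rng(0)
path = simulate_ctmc(Q, lam, state=0, t_max=10.0, rng=rng)
```

The returned `path` is the list of (jump time, new state) pairs, which is all the information a single realization of the process carries.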

You can then construct an evolution equation, sometimes called a master equation, for the continuous-time transition probabilities $P_{ij}(t)$ using these parameters.
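In the time-homogeneous case, the master equation takes the form $P'(t) = P(t)G$, where the generator $G$ has off-diagonal entries $\lambda_i Q_{ij}$ and diagonal entries $-\lambda_i$, and its solution is the matrix exponential $P(t) = e^{tG}$. A sketch with a hypothetical 3-state example (all numbers invented for illustration):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical embedded chain Q and holding rates lam, as in a toy
# 3-state example; the generator is G[i, j] = lam[i] * Q[i, j] for
# i != j and G[i, i] = -lam[i], so each row of G sums to zero.
Q = np.array([[0.0, 0.7, 0.3],
              [0.5, 0.0, 0.5],
              [0.2, 0.8, 0.0]])
lam = np.array([1.0, 2.0, 0.5])
G = lam[:, None] * Q - np.diag(lam)

# The master equation P'(t) = P(t) G with P(0) = I has solution
# P(t) = expm(t G); row i of P(t) is the distribution of X_t given X_0 = i.
P = lambda t: expm(t * G)
```

For every $t \ge 0$ the rows of $P(t)$ are probability distributions, and $P(0)$ is the identity, which is a quick sanity check on the construction.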

The Chapman-Kolmogorov equations for continuous-time Markov processes are "the same thing" as in the discrete case: an identity for the transition probabilities obtained by conditioning on the state at an intermediate time and exploiting the Markov property.
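Concretely, in the time-homogeneous finite-state case Chapman-Kolmogorov reads $P(s+t) = P(s)\,P(t)$: summing over the intermediate state is exactly matrix multiplication, just as $P^{(m+n)} = P^{(m)}P^{(n)}$ for a discrete chain. A numeric check on a hypothetical 3-state generator (values invented for illustration):

```python
import numpy as np
from scipy.linalg import expm

# Toy generator built from an invented embedded chain and holding rates.
Q = np.array([[0.0, 0.7, 0.3],
              [0.5, 0.0, 0.5],
              [0.2, 0.8, 0.0]])
lam = np.array([1.0, 2.0, 0.5])
G = lam[:, None] * Q - np.diag(lam)
P = lambda t: expm(t * G)

# Chapman-Kolmogorov: conditioning on the state at the intermediate
# time s and summing over it gives P(s + t) = P(s) @ P(t).
s, t = 0.7, 1.3
assert np.allclose(P(s) @ P(t), P(s + t))
```

This holds exactly (up to floating point) because $sG$ and $tG$ commute, so $e^{sG}e^{tG} = e^{(s+t)G}$.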