Markov property definition for (continuous-time) Markov processes


We have defined Markov processes (in continuous time) as a collection of random variables $(X(t))_{t\in \mathbb{R}_+}$ satisfying, in particular, the property:

$P(X(t+s)=j|X(u), 0\leq u \leq t)=P(X(t+s)=j|X(t))$

So my question is: is this really a conditional probability given a random variable, $P(A\mid X)$ (I know this concept exists), or does it actually mean

$P(X(t+s)=j|X(t)=i, X(u)=x(u), 0\leq u < t)=P(X(t+s)=j|X(t)=i)$?
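For reference, the two readings can be written side by side. The first display is the measure-theoretic form, where both sides are random variables and the equality holds almost surely; the second is the pointwise form, well defined when the conditioning event has positive probability. (The notation $\mathcal{F}_t$ for the natural filtration is my addition, not from the question.)

```latex
% Sigma-algebra form: condition on the whole past as random variables;
% both sides are random variables and the equality holds a.s.
\[
  P\bigl(X(t+s)=j \,\big|\, \mathcal{F}_t\bigr)
    = P\bigl(X(t+s)=j \,\big|\, X(t)\bigr) \quad \text{a.s.},
  \qquad \mathcal{F}_t := \sigma\bigl(X(u) : 0 \le u \le t\bigr).
\]
% Pointwise form: condition on observed values, defined whenever the
% conditioning event has positive probability.
\[
  P\bigl(X(t+s)=j \,\big|\, X(t)=i,\ X(u)=x(u),\ 0\le u < t\bigr)
    = P\bigl(X(t+s)=j \,\big|\, X(t)=i\bigr).
\]
```

For chains with countable state space, the pointwise form over finite collections of times (as in the answer below from Grimmett & Stirzaker) is the usual working definition, and it is equivalent to the sigma-algebra form under mild regularity conditions.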

Best answer:

The definition in Grimmett & Stirzaker's Probability and Random Processes states:

The process $X$ is called a (continuous-time) Markov chain if it satisfies the Markov property: $$P(X(t_n)=j|X(t_1)=i_1,\cdots, X(t_{n-1})=i_{n-1})= P(X(t_n)=j|X(t_{n-1})=i_{n-1})$$ for all $j,i_1,\cdots, i_{n-1} \in S$ and any sequence $t_1<t_2<\cdots <t_n$ of times.

I think this definition should make things clearer: it conditions on finitely many observed values, not on a sigma-algebra.
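To see the finite-dimensional definition in action, here is a minimal simulation sketch (my own illustration, not from the book): a two-state continuous-time Markov chain with exponential holding times, where we compare the empirical estimate of $P(X(t_3)=j \mid X(t_2)=i)$ with the estimate that also conditions on $X(t_1)$. By the Markov property the two should nearly coincide. The rates and observation times are arbitrary choices.

```python
import random

def simulate_ctmc(rates, x0, times, rng):
    """Simulate a two-state CTMC on {0, 1}: hold in the current state for
    an Exp(rates[state]) time, then flip. Return the states observed at
    each time in the increasing sequence `times`."""
    t, state = 0.0, x0
    next_jump = t + rng.expovariate(rates[state])
    observed = []
    for obs in times:
        while next_jump <= obs:
            t = next_jump
            state = 1 - state  # two states, so every jump is a flip
            next_jump = t + rng.expovariate(rates[state])
        observed.append(state)
    return tuple(observed)

rng = random.Random(0)
rates = [1.0, 2.0]  # exit rates of states 0 and 1 (arbitrary)
samples = [simulate_ctmc(rates, 0, (0.5, 1.0, 1.5), rng)
           for _ in range(200_000)]

# Estimate P(X(1.5)=1 | X(1.0)=0) ...
cond = [s for s in samples if s[1] == 0]
p_markov = sum(s[2] == 1 for s in cond) / len(cond)

# ... versus P(X(1.5)=1 | X(1.0)=0, X(0.5)=1): extra conditioning on
# the earlier observation should not change the answer.
cond_full = [s for s in cond if s[0] == 1]
p_full = sum(s[2] == 1 for s in cond_full) / len(cond_full)

print(p_markov, p_full)  # the two estimates should nearly coincide
```

Up to Monte Carlo noise the two conditional frequencies agree, which is exactly the content of the displayed definition: once $X(t_{n-1})$ is known, the earlier observations carry no further information.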