Meaning of $dx$ in probability.


I have been reading the book "Introduction to Stochastic Calculus with Applications" by Klebaner.

The author defines $P(y,t,x,s)=P(X_t \leq y \mid X_s =x)$, where $(X_u)$ is a stochastic process and $0\leq s < t$.

In this book I found the Chapman-Kolmogorov equation:

$$P(y,t,x,s)= \int_{-\infty}^{\infty} P(y,t,z,u)\,P(dz,u,x,s) \quad \text{for any } s<u<t.$$

By definition, $P(dz,u,x,s)= P( X_u \leq dz \mid X_s=x)$.

Could you explain the meaning of $P(X_u \leq dz \mid X_s =x)$?


There is 1 answer below.


It is just notation for Lebesgue integration: you have fixed $y,t,u,x,s$ and then you integrate $f(z):=P(y,t,z,u)$ against the measure $\mu(A):=P(A,u,x,s)$.
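To make this concrete, here is a minimal numerical sketch (not from the book) that checks the Chapman-Kolmogorov identity for standard Brownian motion, where $P(y,t,x,s)=\Phi\big((y-x)/\sqrt{t-s}\big)$ with $\Phi$ the standard normal CDF. The integral against the measure $P(dz,u,x,s)$ is approximated by integrating against the Gaussian transition *density* with a Riemann sum; the function names `Phi`, `P`, `p_density`, and `ck_rhs` are my own choices for illustration.

```python
import math

def Phi(a):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))

def P(y, t, x, s):
    # Brownian-motion transition function P(y,t,x,s) = P(X_t <= y | X_s = x)
    return Phi((y - x) / math.sqrt(t - s))

def p_density(z, u, x, s):
    # transition density (d/dz) P(z,u,x,s): a N(x, u-s) density in z
    var = u - s
    return math.exp(-((z - x) ** 2) / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def ck_rhs(y, t, x, s, u, n=20000, width=10.0):
    # Riemann-sum approximation of  ∫ P(y,t,z,u) P(dz,u,x,s),
    # replacing the measure P(dz,u,x,s) by p_density(z,u,x,s) dz
    lo, hi = x - width, x + width
    dz = (hi - lo) / n
    total = 0.0
    for i in range(n):
        z = lo + (i + 0.5) * dz
        total += P(y, t, z, u) * p_density(z, u, x, s) * dz
    return total

lhs = P(1.0, 2.0, 0.0, 0.0)           # left-hand side of Chapman-Kolmogorov
rhs = ck_rhs(1.0, 2.0, 0.0, 0.0, u=1.0)  # right-hand side, intermediate time u = 1
print(lhs, rhs)
```

The two printed values agree to several decimal places, which is exactly what the notation asserts: $P(dz,u,x,s)$ is the probability measure of $X_u$ given $X_s=x$, and the right-hand side integrates $z \mapsto P(y,t,z,u)$ against it.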