In a book I am reading, this is the definition of a continuous-time Markov chain:
1. It has a finite number, $N$, of states.
2. For arbitrary $j,k,s$ and $t$ with $s<t$, assume that $X_s=j$ and consider the probability that $X_t=k$. This probability does not change if information about the behaviour of the process during the interval $[0,s)$ is added to the knowledge that $X_s=j$.
3. $\lim\limits_{t \downarrow s}P_{jk}(s,t)=\delta_{jk}$ for all $j,k$ and $s \ge 0$, where $\delta_{jk}$ is the Kronecker delta.
4. For all $j,k$ with $j\ne k$ and all $t \ge 0$,
$$\mu_{jk}(t)=\lim\limits_{\Delta t \downarrow 0}\frac{P_{jk}(t,t+\Delta t)}{\Delta t}$$
exists and is continuous in $t$.
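For intuition (not part of the book's definition), conditions 3 and 4 can be checked numerically in the time-homogeneous case, where the transition matrix depends only on the elapsed time and equals $P(\Delta t)=e^{Q\,\Delta t}$ for a rate matrix $Q$ whose off-diagonal entries are the $\mu_{jk}$. A minimal sketch with a made-up two-state $Q$:

```python
import numpy as np

def expm(A, terms=30):
    """Matrix exponential via a truncated Taylor series (fine for small ||A||)."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for n in range(1, terms):
        term = term @ A / n
        result = result + term
    return result

# Hypothetical two-state rate matrix Q: rows sum to zero,
# off-diagonal entries are the transition rates mu_jk.
Q = np.array([[-2.0, 2.0],
              [1.0, -1.0]])

for dt in [1e-1, 1e-3, 1e-5]:
    P = expm(Q * dt)
    # Condition 3: P(dt) tends to the identity (the Kronecker delta) as dt -> 0.
    # Condition 4: P_jk(dt)/dt tends to the rate mu_jk = Q_jk for j != k.
    print(f"dt={dt:g}: P00={P[0, 0]:.6f}, P01/dt={P[0, 1] / dt:.4f}")
```

As $\Delta t \downarrow 0$, the diagonal entries tend to $1$ and the off-diagonal entries to $0$ (condition 3), while $P_{01}(\Delta t)/\Delta t$ tends to $\mu_{01}=2$ (condition 4).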
I am wondering whether condition $3$ should instead read $\lim\limits_{s \uparrow t}P_{jk}(s,t)=\delta_{jk}$.
The reason I am wondering about this is that condition $4$ already seems to imply condition $3$: if the limit in $4$ exists, then $P_{jk}(t,t+\Delta t)\to 0$ for $j\ne k$ as $\Delta t \downarrow 0$, and since the rows sum to one, $P_{jj}(t,t+\Delta t)\to 1$, which is exactly condition $3$. Moreover, the author has this comment after the conditions:
> By these four assumptions, it may be shown that for each $s\ge0$, each $P_{jk}(s,\cdot)$ is continuous for all $t\ge s$. Similarly, for each $t>0$, each $P_{jk}(\cdot,t)$ is continuous for all $s$ in the closed interval $[0,t]$.
Using the four conditions as stated, I am only able to show right-continuity, not the full claim. With the modified condition I suggested above, however, I am able to prove left-continuity as well.
What do you guys think? Is the definition wrong, and should it be modified?