Markov chain in continuous time (transition probabilities)


Let $(X_t)_{t \geq 0}$ be a continuous-time Markov chain with state space $\mathbb{N}_0$. I want to express

$\mathbb{P}(X_t = 2| X_0 = 1, X_{3t} = 1)$ and

$\mathbb{P}(X_t = 2 | X_0 = 1, X_{3t} = 1, X_{4t} = 1)$

through the $(p_{ij}(t))_{i,j \in \mathbb{N}_0}$.

I know that $\mathbb{P}(X_t = j \mid X_s = i) = p_{ij}(t-s)$ for $s < t$. I don't see how I can use this definition, since the conditioning involves the future. If there were additional conditions on the past, I could drop them by the Markov property, but here the situation is different. I'd be thankful for any help with this problem!
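To build intuition, here is a small numerical sketch. It assumes (this is my guess at the route, via Bayes' rule and the Markov property applied to the joint path probability) that the first quantity would come out as $p_{12}(t)\,p_{21}(2t)/p_{11}(3t)$, and that the extra future condition $X_{4t}=1$ contributes a factor $p_{11}(t)$ to both numerator and denominator, so it cancels. The generator `Q` and the time `t` below are arbitrary toy choices, not part of the question:

```python
import numpy as np

def expm(A, terms=60):
    """Matrix exponential via truncated Taylor series (fine for small matrices)."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result += term
    return result

# Toy generator on states {0, 1, 2} (rows sum to 0) -- an arbitrary example.
Q = np.array([[-1.0,  0.7,  0.3],
              [ 0.4, -1.0,  0.6],
              [ 0.5,  0.5, -1.0]])

t = 0.8
P = lambda s: expm(Q * s)  # P(s)[i, j] = p_{ij}(s)

# Candidate: P(X_t=2 | X_0=1, X_{3t}=1) = p_{12}(t) p_{21}(2t) / p_{11}(3t)
lhs1 = P(t)[1, 2] * P(2 * t)[2, 1] / P(3 * t)[1, 1]

# With the extra condition X_{4t}=1, the segment [3t, 4t] contributes
# p_{11}(t) to both numerator and denominator, so it should cancel:
lhs2 = (P(t)[1, 2] * P(2 * t)[2, 1] * P(t)[1, 1]) / (P(3 * t)[1, 1] * P(t)[1, 1])

print(lhs1, lhs2)  # the two conditional probabilities coincide

# Sanity check (Chapman-Kolmogorov): sum_k p_{1k}(t) p_{k1}(2t) = p_{11}(3t),
# so the candidate conditional probabilities sum to 1 over the middle state.
ck = sum(P(t)[1, k] * P(2 * t)[k, 1] for k in range(3))
print(np.isclose(ck, P(3 * t)[1, 1]))
```

The Chapman-Kolmogorov check at the end is what makes the denominator $p_{11}(3t)$ plausible: it is exactly the normalizing constant for the middle-state distribution.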