Let $\{X(t): t \geq 0\}$ be a continuous-time Markov chain. Let $T_{i}$ denote the holding time in state $i$. Then we have the following proposition:
Could you please explain what is meant mathematically by "the process starts out in state $i$"? Thank you so much!


I am converting @астон вілла олоф мэллбэрг's comment into an answer to close this question.
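
For reference, the standard reading (a sketch, and consistent with the usual textbook setup; the rate notation $v_i$ is an assumption, not taken from the question) is that "the process starts out in state $i$" means every probability is computed conditional on the event $\{X(0) = i\}$. The holding time $T_i$ is then defined through this conditional law:

```latex
% "Starts out in state i" = condition on the event {X(0) = i}.
% The holding time T_i in state i is the time until the first transition:
P(T_i > t) \;=\; P\bigl(X(s) = i \ \text{for all } s \in [0, t] \,\bigm|\, X(0) = i\bigr).

% The Markov property then yields memorylessness of T_i:
P(T_i > s + t \mid T_i > s) \;=\; P(T_i > t), \qquad s, t \geq 0,

% which forces T_i to be exponentially distributed, say with some rate v_i:
P(T_i > t) \;=\; e^{-v_i t}.
```

In other words, the phrase does not impose a new assumption on the chain; it simply fixes the conditioning event $\{X(0) = i\}$ under which the distribution of $T_i$ is stated.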