I have a Markov process $Q_t$ with jump rate $\sigma$. In this process we can only jump to the neighbouring state on the right, and the jump size is $\varepsilon>0$. Starting from $Q_0$, the process waits a random amount of time $T_1$ before jumping to the new state. After that jump, it waits another random amount of time, $T_2$, before the next jump, and so on. The waiting times $T_1, T_2, T_3,\dots$ are independent, exponentially distributed continuous random variables, i.e. $T_i\sim \mathrm{Exp}(\sigma)$. Then with \begin{align} \widetilde{T_k} = \sum_{i=1}^{k}T_i \end{align} one can define the time of the $k$-th jump, in other words the total waiting time until the $k$-th jump.
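To make the setup concrete, here is a minimal simulation sketch of one path of this process (the function name, the choice $Q_0=0$, and the parameter values are my own illustration, not part of the problem): it draws i.i.d. $\mathrm{Exp}(\sigma)$ waiting times, accumulates them into the jump times $\widetilde{T_k}$, and counts $\varepsilon$ per completed jump.

```python
import random

def simulate_jump_process(sigma, eps, t_max, q0=0.0, seed=0):
    """Simulate one path of the right-jumping process Q_t on [0, t_max].

    Waiting times T_i ~ Exp(sigma) are i.i.d.; the k-th jump occurs at
    the partial sum T~_k = T_1 + ... + T_k, and each jump moves the
    state eps to the right.  Returns the jump times T~_k that fall in
    [0, t_max] and the state Q_{t_max}.
    """
    rng = random.Random(seed)
    jump_times = []
    t = 0.0
    while True:
        t += rng.expovariate(sigma)  # next waiting time T_i ~ Exp(sigma)
        if t > t_max:
            break
        jump_times.append(t)         # this is T~_k for k = len(jump_times)
    # State at t_max: Q_0 plus eps for every jump completed by then
    q_t = q0 + eps * len(jump_times)
    return jump_times, q_t

jump_times, q = simulate_jump_process(sigma=2.0, eps=0.5, t_max=10.0)
```

Note that the number of jumps completed by time $t$ is exactly the number of indices $k$ with $\widetilde{T_k}\le t$, which is what the question below is getting at.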
Now my question is the following: Is there a way to express the whole Markov process $Q_t$ in terms of the $\widetilde{T_k}$?