Markov chains: can anything be said about what happens between two transitions?


In a time-homogeneous discrete Markov chain we fix a set period for a single transition; in examples the transition period is sometimes a month, sometimes a week, and so on. I'm wondering whether anything can be said about what happens during a transition (for example, if the transition period is one month, what happens after two weeks?), or whether it is possible to reduce the transition period of an existing model, which might in some cases improve its predictions. I do not know whether this is a valid question to ask about Markov chains, but I believe it is interesting enough to share here. I welcome any answers. Thanks.



Best answer:

Let $P$ denote the transition matrix of some discrete Markov chain $(X_n)$ indexed by the nonnegative integers. Assume that $P=\mathrm e^Q$ where $Q$ is the generator of a Markov process $(\xi_t)$ indexed by the nonnegative real numbers. Then it is natural to realize $(X_n)$ using $(\xi_t)$, via the identity $$X_n=\xi_n,$$ valid for every integer $n$. In other words, one embeds the discrete-time process $(X_n)$ into the continuous-time process $(\xi_t)$, in the sense that $X_0=\xi_0$ and that $(\xi_t)$ realizes a Markov interpolation between the successive states $X_0$, $X_1$, $X_2$, etc. The transition kernel of this interpolating process over a time interval of length $t$ is $\mathrm e^{tQ}$, for any nonnegative $t$.
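For intuition, here is a minimal numerical sketch (using NumPy and SciPy, with a made-up $2\times 2$ generator $Q$; nothing here is specific beyond the construction above): recover $Q$ from a one-step transition matrix $P=\mathrm e^Q$ via the matrix logarithm, then build the half-step kernel $\mathrm e^{Q/2}$, which is what "two weeks into a one-month transition" looks like for an embeddable chain.

```python
import numpy as np
from scipy.linalg import expm, logm

# Hypothetical generator Q: off-diagonal rates nonnegative, rows sum to zero.
Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])

# One-step (say, one-month) transition matrix of the discrete chain.
P = expm(Q)

# Recover the generator from P via the principal matrix logarithm.
Q_rec = np.real(logm(P))

# Half-step kernel e^{Q/2}: the transition law over half the period.
P_half = expm(Q_rec / 2)

# Two half-steps compose to one full step: e^{Q/2} e^{Q/2} = e^Q = P.
print(np.round(P_half @ P_half - P, 10))
```

The composition property `P_half @ P_half == P` is exactly the Markov interpolation described above, and $\mathrm e^{tQ}$ for fractional $t$ answers the "what happens after two weeks" question.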

The limitations of this approach are twofold:

  • Not every transition matrix can be written as $P=\mathrm e^Q$ for some generator $Q$ (those that can are called embeddable).
  • Some transition matrices satisfy $P=\mathrm e^Q$ for more than one generator $Q$, so the interpolation need not be unique.
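To illustrate the first limitation, here is a small check (again a sketch with NumPy/SciPy) on the chain that deterministically swaps two states. Its transition matrix is perfectly valid, but its principal matrix logarithm is complex, and in fact no real generator exists: a $2\times 2$ stochastic matrix is embeddable exactly when its determinant is positive, and here $\det P = -1$.

```python
import numpy as np
from scipy.linalg import logm

# Deterministic swap between two states: a valid one-step transition matrix.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Its principal matrix logarithm has nonzero imaginary parts, so it cannot
# serve as a real generator; P is not embeddable (det P = -1 < 0).
L = logm(P)
print(np.round(L, 3))
print(np.linalg.det(P))
```

Intuitively, there is no continuous-time Markov evolution that, run for one unit of time, lands exactly on a deterministic swap.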