The first holding time of a continuous-time Markov chain is exponentially distributed


Let $X = (X_t)_{t\geq 0}$ be a time-homogeneous Markov process on a finite state space $S$ with right-continuous paths and generator $Q$ (sometimes called the $Q$-matrix). Let $T$ be the first jump time (or, equivalently, the first holding time), defined by

$$ T := \inf \{ t>0: X_t \neq X_0 \}. $$

I would like to show that, given $X_0 = i \in S$, $T$ is exponentially distributed with parameter $q(i) = -Q(i,i)$.

What I have shown is that, given $X_0=i$, $T$ is indeed exponentially distributed with some parameter $\lambda$, by showing it has the memorylessness property (which is in turn a consequence of the Markov property of $X$), but I got stuck at proving $\lambda = q(i)$.

Thank you for your help :).

Best answer

There are two ways to show this.

First proof. We are going to define a new process $Y = (Y_t)_{t\geq 0}$ such that $\text{Law}(Y_0) = \text{Law}(X_0) := m_0$ and $Y$ has right-continuous paths. Transform your $Q$-matrix into a jump matrix $\Pi$, given by $\Pi(i,j) = Q(i,j)/q(i)$ for $j \neq i$ and $\Pi(i,i) = 0$ (when $q(i) > 0$), and define a jump chain $Z = (Z_n)_{n\geq 0}$ that is Markov$(m_0, \Pi)$ as follows:

  1. $\text{Law}(Z_0) = m_0$ and let $S_0 = 0$
  2. For $n\geq 1$, sample an exponential random variable $S_n$ with parameter $q(Z_{n-1})$, independently of the previous steps, and sample $Z_n$ according to the law $\Pi(Z_{n-1},\cdot)$, independently of $S_n$.
  3. Let $J_n = \sum^n_{k=0} S_k$. Define $$ Y_t = \sum_{n= 0}^\infty Z_n\mathbb 1_{J_n \leq t < J_{n+1}}. $$

Of course, there will be some problems when $q(i) = 0$ for some $i$; for simplicity, I assume here that $q(i) \neq 0$ for all $i \in S$ (I believe you can treat the case $q(i)=0$ on your own). Then, using this construction, you can prove that $y_t := \text{Law}(Y_t)$ satisfies

$$ \dot y_t = y_tQ, \quad \forall t >0. $$

Using the right-continuity of sample paths, we deduce that $Y$ and $X$ have the same finite-dimensional distributions, thus

$$ \mathbb P'(S_1 >t|Y_0 = i) = P(T>t |X_0 =i), \quad (*) $$

where $\mathbb P'$ is a probability measure on some probability space where $Y$ is defined. This is because $\{T>t \} = \{ X_s= X_0, \forall s\leq t \} = \{ X_s= X_0, \forall s \in \mathbb Q\cap [0,t] \}$ by right-continuity, and similarly for $\{S_1 >t \}$ in terms of $Y$. Since, given $Y_0 =i$, $S_1$ has an exponential distribution with parameter $q(i)$, so does $T$ given $X_0 = i$, by $(*)$.

Second proof. This is a direct approach. Note that $$ P(T>t | X_0 = i):= P_i(T>t) = \lim_{n\rightarrow \infty} P_i\left(X(kt) = i, \ \forall k = 0, 1/2^n, 2/2^n, \dots, (2^n-1)/2^n, 1\right). $$ A proof of this fact can be found in this post. Now, by the Markov property, writing $P_t(i,i) = P_i(X_t=i)$, $$ P_i\left(X(kt) = i, \ \forall k = 0, 1/2^n, 2/2^n, \dots, (2^n-1)/2^n, 1\right) = \left(P_{t/2^n}(i,i) \right)^{2^n},$$

so

$$ -\lambda = \frac{\ln P_i(T>t)}{t} = \lim_{n\rightarrow \infty} \frac{2^n}{t} \ln P_{t/2^n}(i,i) = \lim_{h \rightarrow 0} \frac{\ln P_{h}(i,i)}{h} = \frac{\frac{d}{dh}P_h(i,i)\big|_{h=0}}{P_0(i,i)} = \frac{Q(i,i)}{1} = -q(i), $$

where the second-to-last equality is the chain rule applied to $h \mapsto \ln P_h(i,i)$ at $h=0$, using $P_0(i,i) = 1$ and $\frac{d}{dh}P_h(i,i)\big|_{h=0} = Q(i,i)$.
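Both limits in this computation are easy to observe numerically. The sketch below uses a made-up 3-state $Q$-matrix (an illustration, not from the question; here $Q(0,0) = -2$, so $q(0)=2$), computes $P_t = e^{tQ}$ by a truncated power series, and checks that $\left(P_{t/2^n}(i,i)\right)^{2^n} \to e^{-q(i)t}$ and $\ln P_h(i,i)/h \to Q(i,i)$:

```python
import math
import numpy as np

# Hypothetical 3-state Q-matrix (illustration only); here Q(0,0) = -2, so q(0) = 2.
Q = np.array([[-2.0, 1.0, 1.0],
              [0.5, -1.0, 0.5],
              [1.0, 2.0, -3.0]])

def expQ(t, terms=30):
    """Transition matrix P_t = e^{tQ} via a truncated power series."""
    return sum(np.linalg.matrix_power(Q * t, k) / math.factorial(k)
               for k in range(terms))

t, i = 1.0, 0

# (P_{t/2^n}(i,i))^{2^n} tends to P_i(T > t) = e^{-q(i) t}:
for n in [1, 4, 8, 12]:
    dyadic = expQ(t / 2**n)[i, i] ** (2**n)
    print(n, dyadic)               # tends to e^{-2} ~ 0.1353 as n grows

# ln P_h(i,i) / h tends to Q(i,i):
for h in [0.1, 0.01, 0.001]:
    log_rate = math.log(expQ(h)[i, i]) / h
    print(h, log_rate)             # tends to Q(0,0) = -2 as h -> 0
```

The first loop is the dyadic-grid approximation of $P_i(T>t)$ from the second proof; the second loop is the logarithmic limit that identifies $\lambda$ with $q(i)$.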