Inter-arrival time distribution of a Poisson process


I'm looking for a rigorous way to prove that if $\{N_t\}_{t\geq 0}$ is a Poisson process of rate $\lambda >0$, then the inter-arrival times $(H_n)_{n\in \mathbb{N}}$ are independent and identically distributed exponential random variables with rate parameter $\lambda$.

Note that I work with the two definitions of a Poisson process that do not use the above statement as a definition (the equivalence between the two definitions is already proved):

Def 1: A Poisson process $\{N_t\}_{t\geq 0}$ of rate $\lambda >0$ is a stochastic process with values in $\mathbb{N}$ satisfying:

  1. $N_0=0$ a.s
  2. The increments are independent, that is, given $n\in \mathbb{N}$ and $0\leq t_0<\dots<t_n$, the random variables $N_{t_0}, N_{t_1}-N_{t_0},\dots,N_{t_n}-N_{t_{n-1}}$ are independent.
  3. The increments are stationary: $P(N_t-N_s=k)=P(N_{t-s}=k)$ for all $k\in\mathbb{N}$ and $0\leq s\leq t$.
  4. $P(N_{t+\delta}-N_t=1)=\lambda\delta+o(\delta)$
  5. $P(N_{t+\delta}-N_t \geq 2)=o(\delta)$

Def 2: A Poisson process $\{N_t\}_{t\geq 0}$ of rate $\lambda >0$ is a stochastic process with values in $\mathbb{N}$ satisfying:

  1. $N_0=0$ a.s
  2. The increments are independent, that is, given $n\in \mathbb{N}$ and $0\leq t_0<\dots<t_n$, the random variables $N_{t_0}, N_{t_1}-N_{t_0},\dots,N_{t_n}-N_{t_{n-1}}$ are independent.
  3. The increments are stationary: $P(N_t-N_s=k)=P(N_{t-s}=k)$ for all $k\in\mathbb{N}$ and $0\leq s\leq t$.
  4. $N_t \sim \operatorname{Poi}(\lambda t)$ for every $t\geq 0$.
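As an aside, a quick simulation can illustrate how the marginal law in Def 2 already encodes the infinitesimal conditions 4–5 of Def 1. This is only a sketch; the values of $\lambda$ and $\delta$ below are arbitrary choices.

```python
import numpy as np

# Empirical illustration: if N_delta ~ Poi(lam * delta) (Def 2), then
# P(N_delta = 1) = lam*delta + o(delta) and P(N_delta >= 2) = o(delta),
# which are conditions 4-5 of Def 1.
rng = np.random.default_rng(42)
lam = 2.0
for delta in (0.1, 0.01, 0.001):
    counts = rng.poisson(lam * delta, size=1_000_000)
    p1 = np.mean(counts == 1)   # empirical P(N_delta = 1)
    p2 = np.mean(counts >= 2)   # empirical P(N_delta >= 2)
    print(delta, p1 / delta, p2 / delta)  # p1/delta -> lam, p2/delta -> 0
```

As $\delta$ shrinks, $P(N_\delta=1)/\delta$ approaches $\lambda$ and $P(N_\delta\geq 2)/\delta$ approaches $0$, consistent with Def 1.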

What I want to avoid is what seems to be the classic proof, which considers objects like $P(H_2>t\mid H_1=t_1)$. This makes no sense to me, since $H_1$ is continuous (namely exponential), so $P(H_1=t_1)=0$ for all $t_1$, while $H_2$ could, at this stage, be any type of random variable.

Thanks in advance to anyone who can propose a good proof.


There are 2 best solutions below


We will assume that $N$ has right-continuous sample paths. Define the first arrival time $T_1$ by

$$ T_1 = \inf \{ t \geq 0 : N_t \geq 1 \}. $$

By the right-continuity, this implies that $N_{T_1} = 1$. Then we will prove the following claim, which is a special case of the strong Markov property:

Claim. Define a new stochastic process $\tilde{N} = ( \tilde{N}_t )_{t\geq 0}$ by $$ \tilde{N}_t = N_{T_1 + t} - N_{T_1} = N_{T_1 + t} - 1. $$ Then we have the following:

  1. $T_1$ and $\tilde{N}$ are independent.
  2. $\tilde{N}$ is a Poisson process of rate $\lambda$.
  3. $T_1$ is an exponential random variable of rate $\lambda$.

Proof. Fix $0 < t_0 < t_1 < \cdots < t_n$. Also, let $\delta > 0$ be sufficiently small so that $0 < t_0 - \delta$ and $t_{i-1}+\delta < t_i-\delta$ holds for all $i = 1, \dots, n$.

Now, for any parameters $s, s_1, \dots, s_n \geq 0$, we consider

$$ Y = sT_1 + \sum_{i=1}^{n} s_i (\tilde{N}_{t_i} - \tilde{N}_{t_{i-1}}). $$

Then, conditioned on $\{T_1 \in ((k-1)\delta, k\delta] \}$, we have

$$ Y \geq s (k-1)\delta + \sum_{i=1}^{n} s_i (N_{(k-1)\delta + t_i} - N_{k\delta + t_{i-1}}) $$

and so,

$$ \mathbb{E}[e^{-Y}] \leq \sum_{k=1}^{\infty} \mathbb{E}\left[ \exp\left\{ - s (k-1)\delta - \sum_{i=1}^{n} s_i (N_{(k-1)\delta + t_i} - N_{k\delta + t_{i-1}}) \right\} \mathbf{1}_{\{T_1 \in ((k-1)\delta, k\delta] \}} \right]. \tag{1} $$

By noting that $\{T_1 \in ((k-1)\delta, k\delta] \} = \{ N_{(k-1)\delta} = 0 \text{ and } N_{k\delta} \geq 1 \} $, all of

$$ \mathbf{1}_{\{T_1 \in ((k-1)\delta, k\delta] \}}, \quad N_{(k-1)\delta + t_1} - N_{k\delta + t_{0}}, \quad \dots, \quad N_{(k-1)\delta + t_n} - N_{k\delta + t_{n-1}} $$

are independent, and hence the bound $\text{(1)}$ reduces to

\begin{align*} \mathbb{E}[e^{-Y}] &\leq \sum_{k=1}^{\infty} e^{-s(k-1)\delta} \left( \prod_{i=1}^{n} \mathbb{E}\left[ e^{-s_i (N_{(k-1)\delta + t_i} - N_{k\delta + t_{i-1}})} \right] \right) \mathbb{P} \left( T_1 \in ((k-1)\delta, k\delta] \right) \\ &= \sum_{k=1}^{\infty} e^{-s(k-1)\delta} \left( \prod_{i=1}^{n} e^{-\lambda(t_i - t_{i-1} - \delta) (1 - e^{-s_i}) } \right) e^{-\lambda (k-1)\delta} (1 - e^{-\lambda \delta}) \\ &= \frac{1 - e^{-\lambda \delta}}{1 - e^{-(s+\lambda)\delta}} \prod_{i=1}^{n} e^{-\lambda(t_i - t_{i-1} - \delta) (1 - e^{-s_i}) }. \tag{2} \end{align*}

As $\delta \to 0^+$, the bound $\text{(2)}$ converges to

$$ \mathbb{E}[e^{-Y}] \leq \frac{\lambda}{s+\lambda} \prod_{i=1}^{n} e^{-\lambda(t_i - t_{i-1}) (1 - e^{-s_i}) }. \tag{3} $$
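For completeness, the limit taken in passing from $\text{(2)}$ to $\text{(3)}$ is elementary: using $1-e^{-x} = x + o(x)$ as $x \to 0^+$,

$$ \frac{1 - e^{-\lambda \delta}}{1 - e^{-(s+\lambda)\delta}} = \frac{\lambda\delta + o(\delta)}{(s+\lambda)\delta + o(\delta)} \xrightarrow[\delta \to 0^+]{} \frac{\lambda}{s+\lambda}, $$

while each factor $e^{-\lambda(t_i - t_{i-1} - \delta)(1 - e^{-s_i})}$ converges to $e^{-\lambda(t_i - t_{i-1})(1 - e^{-s_i})}$ by continuity.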

By a similar computation applied to

$$ Y \leq s k\delta + \sum_{i=1}^{n} s_i (N_{k\delta + t_i} - N_{(k-1)\delta + t_{i-1}}), $$

we can also obtain the reverse direction of $\text{(3)}$. So it follows that

$$ \mathbb{E}[e^{-Y}] = \frac{\lambda}{s+\lambda} \prod_{i=1}^{n} e^{-\lambda(t_i - t_{i-1}) (1 - e^{-s_i}) } \tag{4} $$

holds for any choices of parameters $s, s_1, \dots, s_n \geq 0$. However, since the multidimensional Laplace transform uniquely determines the law of a given random vector, this proves that:

  1. $ T_1$, $\tilde{N}_{t_1} - \tilde{N}_{t_{0}}$, $\ldots$, $\tilde{N}_{t_n} - \tilde{N}_{t_{n-1}} $ are mutually independent,

  2. $\tilde{N}_{t_i} - \tilde{N}_{t_{i-1}} \sim \operatorname{Poisson}(\lambda(t_i - t_{i-1}))$ for each $i = 1, \dots, n$, and

  3. $T_1 \sim \operatorname{Exp}(\lambda)$.
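To see why $\text{(4)}$ pins down these laws, recall the Laplace transforms being matched: for $T \sim \operatorname{Exp}(\lambda)$ and $M \sim \operatorname{Poisson}(\mu)$,

$$ \mathbb{E}[e^{-sT}] = \int_0^{\infty} \lambda e^{-\lambda u}\, e^{-su}\, du = \frac{\lambda}{s+\lambda}, \qquad \mathbb{E}[e^{-s_i M}] = \sum_{k=0}^{\infty} e^{-\mu} \frac{\mu^k}{k!}\, e^{-s_i k} = e^{-\mu(1 - e^{-s_i})}, $$

so the right-hand side of $\text{(4)}$ is exactly the product of these transforms with $\mu = \lambda(t_i - t_{i-1})$, and the factorization of $\mathbb{E}[e^{-Y}]$ gives the claimed independence.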

Therefore the desired claim follows. $\square$

Now repeatedly applying this claim shows that inter-arrival times are independent exponential random variables of rate $\lambda$ as desired.
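As an empirical cross-check of this conclusion, one can simulate the counting process directly from Def 1's infinitesimal description (a Bernoulli($\lambda\delta$) arrival per grid slot, so the construction does not presuppose the result) and inspect the resulting gaps. This is only a sketch; $\lambda$, the mesh, and the horizon are arbitrary choices.

```python
import numpy as np

# Build the process from Def 1's description: on a grid of mesh delta,
# each slot holds an arrival independently with probability lam*delta
# (the chance of >= 2 arrivals in one slot is O(delta^2) and is neglected).
rng = np.random.default_rng(1)
lam, delta, horizon = 1.5, 1e-3, 2000.0
arrivals = rng.random(int(horizon / delta)) < lam * delta
times = (np.flatnonzero(arrivals) + 1) * delta  # arrival epochs T_1, T_2, ...
gaps = np.diff(times)                           # inter-arrival times H_2, H_3, ...

print(gaps.mean())                              # should be close to 1/lam
print(np.mean(gaps > 1.0))                      # should be close to exp(-lam)
print(np.corrcoef(gaps[:-1], gaps[1:])[0, 1])   # should be close to 0
```

The empirical mean approximates $1/\lambda$, the tail $P(H>1)$ approximates $e^{-\lambda}$, and consecutive gaps are essentially uncorrelated, consistent with i.i.d. $\operatorname{Exp}(\lambda)$ inter-arrival times.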


Let me assume Definition 2 as stated in the question.
Let times of occurrences be $0=T_0<T_1<T_2<\ldots$ and $N_t$ denote the number of occurrences in $(0, t].$

I use the fact that $$\{T_n\leq t\} \equiv\{N_t\geq n \}.$$
The distribution function of $ T_n $ is \begin{eqnarray} F_{T_n}(t)&=& P[T_n\leq t]=P[N_t\geq n]=P[\cup_{j=n}^\infty \{N_t=j\}] \\ &=&\sum_{j=n}^{\infty}P[N_t=j], \qquad \text{as these events are disjoint} \\ &=& \sum_{j=n}^{\infty} \dfrac{(\lambda t)^j}{j!}e^{-\lambda t} \end{eqnarray} The probability density function is
\begin{eqnarray*} f_{T_n}(t)=\dfrac{dF_{T_n}(t)}{dt}&=& \lambda e^{-\lambda t}\sum_{j=n}^{\infty} \dfrac{( \lambda t)^{j-1}}{(j-1)!}- \lambda e^{-\lambda t}\sum_{j=n}^\infty \dfrac{(\lambda t)^j}{j!}\\ &&\qquad \text{By letting } Q_j = \dfrac{(\lambda t)^j}{j!}\\ &=& \lambda e^{-\lambda t}[Q_{n-1}+Q_{n}+Q_{n+1}+\ldots]\\ && -\lambda e^{-\lambda t}[Q_{n}+Q_{n+1}+\ldots]\\ &=&\lambda e^{-\lambda t} Q_{n-1} \end{eqnarray*}
The density of $T_n$ is therefore $$ f_{T_n}(t)=\lambda \dfrac{(\lambda t)^{n-1}}{(n-1)!}e^{-\lambda t}, \qquad t\geq 0,\quad n=1,2,\ldots$$ which is the density of the Erlang distribution of order $ n $.
Hence the density of $T_1$ is $\lambda e^{-\lambda t}, \ t\geq 0, $ which is an exponential density with parameter $\lambda. $
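The differentiation above can be checked numerically by comparing a central difference of the tail-sum CDF against the Erlang density. This is only a sketch: the values of $\lambda$, $n$, $t$ are arbitrary, and the infinite tail sum is truncated.

```python
import math

# Check that the Erlang density is the derivative of the tail-sum CDF:
# F_{T_n}(t) = sum_{j >= n} e^{-lam t} (lam t)^j / j!   (truncated tail)
def tail_cdf(t, n, lam, terms=80):
    return sum(math.exp(-lam * t) * (lam * t) ** j / math.factorial(j)
               for j in range(n, n + terms))

def erlang_pdf(t, n, lam):
    return lam * math.exp(-lam * t) * (lam * t) ** (n - 1) / math.factorial(n - 1)

lam, n, t, h = 2.0, 3, 1.7, 1e-5
numeric = (tail_cdf(t + h, n, lam) - tail_cdf(t - h, n, lam)) / (2 * h)
print(numeric, erlang_pdf(t, n, lam))  # the two values agree closely
```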
Consider the times $T_1$ and $T_2$. Shifting the origin to $T_1$, the second arrival occurs at time $T_2-T_1$. By the memoryless property of the exponential distribution, $T_2-T_1$ is the time to the first arrival of a (new) process starting at $T_1$, and $T_2-T_1$ is independent of $T_1$.
Hence $T_2-T_1$ has the same distribution as $T_1$.
Proceeding this way, we can show that $T_k-T_{k-1}$, $k=2,3,\ldots$ are i.i.d. and have the exponential distribution with parameter $\lambda$.