Poisson Processes with Gamma Arrivals


I think the title is the best description I can give of my problem (but I'm not 100% sure - the problem set-up has me very confused).

So, given a sequence of i.i.d. Gamma random variables $T_1, T_2, \ldots$ with parameters $(3, \lambda)$, where for $t \ge 0$, $$P(T_n \le t) = \int_0^t \frac{\lambda(\lambda x)^2e^{-\lambda x}}{2}\,dx.$$

Then, I'm told that $\{N(t); t \ge 0\}$ can be defined as $$N(t) = \sum_{n=1}^\infty 1(S_n \le t), \quad t \ge 0,$$ where $1(S_n \le t)$ is the indicator function and, for each integer $n \ge 1$, $S_n = T_1 + \cdots + T_n$.

I am asked to derive an expression for $P(N(t) = k)$, for each integer $k \ge0$.

A hint is given: use the fact that each $T_n$ can be interpreted as the sum of three i.i.d. exponential random variables with rate $\lambda$.

My answer follows this idea: this is essentially asking me to evaluate a Poisson counting process (I think), and because the interarrival times $T_n$ are Gamma$(3,\lambda)$, it should simplify to a counting process in which each arrival consists of three i.i.d. exponential stages, each with rate $\lambda$. This also means it should be an Erlang process, but that is irrelevant here.

So, using the base representation for a Poisson Process, $$P(N(t) = k) = e^{-\lambda t} \frac{(\lambda t)^k}{k!}$$ I think the rate ($\lambda$) for my new process should be $\frac{\lambda}{3}$, giving me a final result of: $$P(N(t) = k) = e^{\frac{-\lambda t}{3}} \frac{(\frac{\lambda t}{3})^k}{k!}$$ Does this make sense to anyone else? The only reason I am concerned is this seemed far too simple for the course that I am currently taking.


2 Answers


Using the hint and your intuition that $T_n$ is the time for three arrivals of a Poisson process of rate $\lambda$, I would have thought that $N(t)=k$ was the event that $3k$, $3k+1$ or $3k+2$ arrivals happened in time $t$, with probability

$$P(N(t) = k) = e^{{-\lambda t} }\dfrac{(\lambda t)^{3k}}{(3k)!}+e^{{-\lambda t} }\dfrac{(\lambda t)^{3k+1}}{(3k+1)!}+e^{{-\lambda t} }\dfrac{(\lambda t)^{3k+2}}{(3k+2)!}$$

Simulation in R seems to confirm my suggested expression:

> cases <- 10^5
> shape <- 3
> rate <- 4
> t <- 2
> samples <- 10*ceiling(t*shape/rate)
> N <- rep(NA, cases)
> 
> set.seed(1)
> for (i in 1:cases){
+   N[i] <- sum(cumsum(rgamma(samples, shape=shape, rate=rate )) < t)
+   }
> table(N)/cases # simulated probabilities for N(t)
N
      0       1       2       3       4       5       6       7 
0.01328 0.17638 0.40065 0.29863 0.09385 0.01559 0.00150 0.00012 
> 
> j <- 0:max(N)
> prob_Henry <- exp(-rate*t )*(rate*t)^(3*j)/factorial(3*j) + 
+   exp(-rate*t )*(rate*t)^(3*j+1)/factorial(3*j + 1) +
+   exp(-rate*t )*(rate*t)^(3*j+2)/factorial(3*j + 2) 
> names(prob_Henry) <- j
> prob_Henry # Henry's suggested probabilities for N(t)
           0            1            2            3            4            5 
1.375397e-02 1.774821e-01 4.013113e-01 2.955287e-01 9.466701e-02 1.566273e-02 
           6            7 
1.500294e-03 9.024243e-05 
> 
> prob_mrgstephe <- exp(-rate*t/3 )*(rate*t/3)^j/factorial(j)
> names(prob_mrgstephe) <- j
> prob_mrgstephe # mrgstephe's suggested probabilities for N(t)
         0          1          2          3          4          5          6 
0.06948345 0.18528920 0.24705227 0.21960202 0.14640135 0.07808072 0.03470254 
         7 
0.01322002 

The basic idea stems from the relationship $$\Pr[S_k \le t] = \Pr[N(t) \ge k].$$ This is because $S_k$ is the random time of the $k^{\rm th}$ event, and $N(t)$ is the number of events that have occurred up to time $t$. Intuitively, we can see that the probability that the $k^{\rm th}$ event occurs before time $t$ is equivalent to the probability that at least $k$ events occur by time $t$.
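This pathwise equivalence can be illustrated with a short simulation. The following is a minimal Python sketch (with illustrative values $\lambda = 4$, $t = 2$, $k = 2$; the `gamma3` helper, which draws one interarrival as the sum of three exponentials per the hint, is ad-hoc scaffolding):

```python
import math
import random

random.seed(1)
lam, t, k, trials = 4.0, 2.0, 2, 100_000

def gamma3():
    """One Gamma(3, lam) interarrival: the sum of three Exp(lam) draws."""
    return sum(-math.log(1.0 - random.random()) / lam for _ in range(3))

count_Sk = count_Nt = 0
for _ in range(trials):
    cumsums, s = [], 0.0
    # Draw arrival times until we pass t and have at least k of them
    while s <= t or len(cumsums) < k:
        s += gamma3()
        cumsums.append(s)
    S_k = cumsums[k - 1]                     # time of the k-th event
    N_t = sum(1 for c in cumsums if c <= t)  # number of events by time t
    count_Sk += (S_k <= t)
    count_Nt += (N_t >= k)

# {S_k <= t} and {N(t) >= k} coincide on every path, so the counts agree
assert count_Sk == count_Nt
```

On each sample path the two indicators take the same value, so the empirical frequencies are identical, and both estimate $\Pr[S_k \le t]$ with $S_k \sim \operatorname{Gamma}(3k, \lambda)$.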

Next, note that $S_k = T_1 + T_2 + \cdots + T_k$; that is, the time until the $k^{\rm th}$ event is the sum of the interarrival times of the first $k$ events. Because the sum of i.i.d. gamma variables with a common rate is itself gamma, we have $$S_k \sim \operatorname{Gamma}(3k,\lambda), \quad f_{S_k}(x) = \frac{\lambda (\lambda x)^{3k-1} e^{-\lambda x}}{\Gamma(3k)}.$$

Consequently, for $k \ge 1$ (the case $k = 0$ works the same way, with $\Pr[S_0 \le t] = 1$), $$\begin{align*} \Pr[N(t) = k] &= \Pr[N(t) \ge k] - \Pr[N(t) \ge k+1] \\ &= \Pr[S_k \le t] - \Pr[S_{k+1} \le t] \\ &= \int_{x=0}^t \frac{\lambda (\lambda x)^{3k-1} e^{-\lambda x}}{\Gamma(3k)} \, dx - \int_{x=0}^t \frac{\lambda (\lambda x)^{3k+2} e^{-\lambda x}}{\Gamma(3k+3)} \, dx.\end{align*}$$

We now employ integration by parts on the second integral with the choice $$u = (\lambda x)^{3k+2}, \quad du = (3k+2)\lambda (\lambda x)^{3k+1} \, dx, \\ dv = \lambda e^{-\lambda x} \, dx, \quad v = - e^{-\lambda x},$$ to obtain $$\begin{align*} \Pr[S_{k+1} \le t] &= \left[-e^{-\lambda x} \frac{(\lambda x)^{3k+2}}{(3k+2)!} \right]_{x=0}^t + \int_{x=0}^t \frac{\lambda(\lambda x)^{3k+1}e^{-\lambda x}}{\Gamma(3k+2)} \, dx \\ &= -e^{-\lambda t} \frac{(\lambda t)^{3k+2}}{(3k+2)!} + \int_{x=0}^t \frac{\lambda(\lambda x)^{3k+1}e^{-\lambda x}}{\Gamma(3k+2)} \, dx .\end{align*}$$

Repeating the process twice more on the remaining integral gives $$\Pr[S_{k+1} \le t] = - e^{-\lambda t} \left( \frac{(\lambda t)^{3k+2}}{(3k+2)!} + \frac{(\lambda t)^{3k+1}}{(3k+1)!} + \frac{(\lambda t)^{3k}}{(3k)!}\right) + \Pr[S_k \le t].$$

This proves $$\Pr[N(t) = k] = e^{-\lambda t} \left( \frac{(\lambda t)^{3k+2}}{(3k+2)!} + \frac{(\lambda t)^{3k+1}}{(3k+1)!} + \frac{(\lambda t)^{3k}}{(3k)!}\right),$$ which corresponds to the sum of Poisson probabilities, at rate $\lambda$, for the number of events being $3k$, $3k+1$, or $3k+2$.
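The recursion produced by the integration by parts can be checked numerically. Here is a minimal Python sketch (with arbitrary illustrative values $\lambda = 1.5$, $t = 2$; the `simpson` integrator is ad-hoc scaffolding, and $\Pr[S_0 \le t]$ is taken to be $1$):

```python
import math

lam, t = 1.5, 2.0  # arbitrary illustrative values

def simpson(f, a, b, steps=10000):
    """Composite Simpson's rule (steps must be even)."""
    h = (b - a) / steps
    s = f(a) + f(b) + sum((4 if i % 2 else 2) * f(a + i * h)
                          for i in range(1, steps))
    return s * h / 3

def cdf_S(k):
    """Pr[S_k <= t], integrating the Gamma(3k, lam) density directly."""
    if k == 0:
        return 1.0
    dens = lambda x: (lam * (lam * x) ** (3 * k - 1) * math.exp(-lam * x)
                      / math.factorial(3 * k - 1))
    return simpson(dens, 0.0, t)

# Check: Pr[S_{k+1} <= t] = Pr[S_k <= t] - e^{-lam*t} * (three-term bracket)
for k in range(6):
    bracket = sum((lam * t) ** (3 * k + r) / math.factorial(3 * k + r)
                  for r in range(3))
    assert abs(cdf_S(k + 1) - (cdf_S(k) - math.exp(-lam * t) * bracket)) < 1e-8
```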