Prove Number of Arrivals $N(s)$ up to time $s$ follows $\mathrm{Poisson}(\lambda s)$ Distribution


This comes from my self-study of Durrett's "Essentials of Stochastic Processes" book, page 97.

Definition Let $\tau_1,\tau_2,\ldots$ be independent $\mathrm{exponential}(\lambda)$ random variables. Let $T_n=\tau_1+\tau_2+\cdots+\tau_n$ for $n\ge1$, $T_0=0$, and define $N(s)=\max \{n:T_n \le s\}$.

He says, "we think of the $\tau_n$ as the times between arrivals of customers at a bank, so $T_n=\tau_1+\cdots+\tau_n$ is the arrival time of the $n$th customer, and $N(s)$ is the number of arrivals by time $s$. To check the last interpretation, note that $N(s)=4$ when $T_4 \le s <T_5$, that is, the fourth customer has arrived by time $s$ but the fifth has not."

Claim $N(s)$ has a Poisson distribution with mean $\lambda s$.

Durrett gives a sketch of the proof, but with very sparse details. He says, "now $N(s)=n$ if and only if $T_n \le s < T_{n+1}$, i.e., the $n$th customer arrives before time $s$ but the $(n+1)$th after $s$. Breaking things down according to the value of $T_n=t$, and noting that $T_{n+1}>s$ requires $\tau_{n+1}>s-t$, and that $\tau_{n+1}$ is independent of $T_n$, it follows that $$P(N(s)=n)= \int_0^s f_{T_n}(t)\, P(\tau_{n+1}>s-t)\,dt.$$"
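As a sanity check on Durrett's formula, one can evaluate the integral numerically and compare it with the Poisson pmf. Here is a small sketch using the midpoint rule; the values of $\lambda$, $s$, and $n$ are illustrative choices, not from the text:

```python
import math

# Check Durrett's formula numerically:
#   P(N(s)=n) = ∫_0^s f_{T_n}(t) · P(τ_{n+1} > s-t) dt,
# where f_{T_n} is the Gamma(n, λ) density and P(τ > x) = e^{-λx}.
# The parameter values below are illustrative, not from the text.
lam, s, n = 2.0, 1.5, 3

def gamma_density(t, n, lam):
    """Density of T_n = τ_1 + ... + τ_n, a Gamma(n, λ) random variable."""
    return lam * math.exp(-lam * t) * (lam * t) ** (n - 1) / math.factorial(n - 1)

# Midpoint-rule approximation of the integral over [0, s].
steps = 100_000
dt = s / steps
integral = sum(
    gamma_density((k + 0.5) * dt, n, lam) * math.exp(-lam * (s - (k + 0.5) * dt))
    for k in range(steps)
) * dt

# Poisson(λs) pmf at n, which the integral should reproduce.
poisson_pmf = math.exp(-lam * s) * (lam * s) ** n / math.factorial(n)
print(integral, poisson_pmf)
```

The two printed values agree to many decimal places, which is exactly the claim being proved.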

Me: It is hard to see how he goes from the left-hand side of the equation to the right-hand side. I'm trying to fill in the details of the proof so that I understand it. Could you help me?


Edit: Thank you for your help! I've got it now. It relies on the law of total probability. If the random variable $X$ denotes the outcome of a coin toss and $Y$ denotes the outcome of a die roll, then \begin{equation} P(X=H)=\sum_{y=1}^{6} P(Y=y, X=H). \end{equation}
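The marginalization identity above can be made concrete with a toy enumeration (assuming a fair coin and a fair die, independent of each other):

```python
from itertools import product

# Toy illustration of the identity P(X=H) = Σ_y P(Y=y, X=H)
# for an independent fair coin X and fair die Y.
coin = ["H", "T"]          # outcomes of X, each with probability 1/2
die = [1, 2, 3, 4, 5, 6]   # outcomes of Y, each with probability 1/6

# Joint probability of each (coin, die) pair under independence.
joint = {(x, y): (1 / 2) * (1 / 6) for x, y in product(coin, die)}

# Summing the joint probabilities over all die outcomes recovers P(X=H).
p_heads = sum(joint[("H", y)] for y in die)
print(p_heads)  # ≈ 0.5
```

In the proof below, the sum over die outcomes becomes an integral over the possible values $t$ of $T_n$.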

With that in mind, here's the complete proof:

Proof \begin{align*} P(N(s)=n) &= P(T_n \le s <T_{n+1}) \\ &=\int_0^s P(T_n=t \le s<T_{n+1})\,dt \\ &=\int_0^s P(T_n=t\textrm{ and }\tau_{n+1}>s-t)\,dt \\ &=\int_0^s f_{T_n}(t)\, P(\tau_{n+1}>s-t)\,dt &&\textrm{(by independence of $T_n$ and $\tau_{n+1}$)} \\ &=\int_0^s \lambda e ^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!}\, P(\tau_{n+1}>s-t)\,dt &&\textrm{($T_n$, a sum of $n$ exponentials, is $\mathrm{Gamma}(n,\lambda)$)} \\ &=\int_0^s \lambda e ^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!} \left(1-F_\tau (s-t)\right)dt \\ &=\int_0^s \lambda e ^{-\lambda t} \frac{(\lambda t)^{n-1}}{(n-1)!}\, e^{-\lambda(s-t)}\,dt \\ &=e^{-\lambda s} \frac{\lambda^n}{(n-1)!} \int_0^s t^{n-1}\, dt \\ &=e^{-\lambda s} \frac{\lambda^n}{(n-1)!} \cdot \frac{s^n}{n} \\ &=e^{-\lambda s} \frac{(\lambda s)^n}{n!} \end{align*} where the last line is $P(X=n)$ for $X \sim \mathrm{Poisson}(\lambda s)$.
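The claim can also be checked by simulation, independently of the proof: draw exponential inter-arrival times, count how many fall in $[0,s]$, and compare the empirical distribution of $N(s)$ with the Poisson$(\lambda s)$ pmf. A minimal sketch (the values of $\lambda$ and $s$ are arbitrary illustrative choices):

```python
import math
import random

# Monte Carlo sanity check (not part of the proof): simulate arrival times
# T_n = τ_1 + ... + τ_n with τ_i ~ exponential(λ), count arrivals by time s,
# and compare the empirical distribution of N(s) with the Poisson(λs) pmf.
random.seed(0)
lam, s, trials = 1.5, 2.0, 200_000

def count_arrivals(lam, s):
    """Return N(s): the number of exponential inter-arrival times fitting in [0, s]."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > s:
            return n
        n += 1

counts = {}
for _ in range(trials):
    n = count_arrivals(lam, s)
    counts[n] = counts.get(n, 0) + 1

# Empirical frequencies vs. the Poisson(λs) pmf for small n.
for n in range(6):
    empirical = counts.get(n, 0) / trials
    theory = math.exp(-lam * s) * (lam * s) ** n / math.factorial(n)
    print(n, round(empirical, 4), round(theory, 4))
```

With 200,000 trials the empirical frequencies and the Poisson$(\lambda s)$ probabilities agree to roughly two decimal places.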

Best answer:

Your attempt breaks down at the second $=$ sign, where a mysterious $t$ appears (what is $t$ here?). Instead, one should write $$ (\ast)=P(T_n \leqslant s <T_{n+1})=P(T_n\leqslant s\lt T_n+\tau_{n+1}). $$ Now, $T_n$ and $\tau_{n+1}$ are independent and one knows their distributions, hence one can evaluate the RHS. Say the distribution of $T_n$ has density $g_n$ and the distribution of $\tau_{n+1}$ has density $h$; then, by definition, $$ (\ast)=\int_0^s g_n(t)\int_{s-t}^\infty h(u)\,\mathrm du\,\mathrm dt. $$ What is left to do is to identify $g_n$ and $h$. It seems that you are aware that $h(u)=\lambda\mathrm e^{-\lambda u}\mathbf 1_{u\gt0}$, and probably also of what $g_n$ is, so you can plug them into the RHS and deduce the value of $(\ast)=P(N(s)=n)$. Can you do that?

If you do this carefully, you will realize that the case $n=0$ is specific because $T_0=0$ has no density. To solve this case, simply use the identity $$ [N(s)=0]=[\tau_1\gt s]. $$
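For completeness, that identity settles the $n=0$ case in one line, using only the exponential tail $P(\tau_1>s)=e^{-\lambda s}$ already used above:

$$ P(N(s)=0)=P(\tau_1>s)=e^{-\lambda s}=e^{-\lambda s}\,\frac{(\lambda s)^0}{0!}, $$

which is exactly the Poisson$(\lambda s)$ pmf at $n=0$.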