An expression for $P(N_s=k\mid T_n=t)$ where $\{N_t;\ t\geq0\}$ is a Poisson process with parameter $\lambda$


Problem

  • Let $\{N_t;\ t\geq0\}$ be a Poisson process with parameter $\lambda$
  • Solve for $P(N_s=k\mid T_n=t)$, where $T_n$ is the arrival time of the $n$-th event and $s$ and $k$ are positive integers

My Approach
There are two cases to consider:

  • Case 1: $s>t$ and $k\geq n$
  • Case 2: $s<t$ and $k\leq n-1$

In Case 1, $P(N_s=k\mid T_n=t)=P(N_s=k)=\frac{(\lambda s)^k e^{-\lambda s}}{k!}$

In Case 2, $(N_s\mid T_n=t)\sim B(n-1,\,s/t)$, so $P(N_s=k\mid T_n=t)=\frac{(n-1)!}{k!(n-1-k)!}\left(\frac{s}{t}\right)^k\left(1-\frac{s}{t}\right)^{n-1-k}$

I am unsure whether my approach is correct in either case, and I also do not know how to prove that in Case 2, $(N_s\mid T_n=t)\sim B(n-1,\,s/t)$.

On BEST ANSWER

Your result in the first case is not quite right. The intuitive idea is that at time $t$, the first $n$ events have occurred, and we are asking for the probability that the remaining $k-n$ events happen in the remaining time $s-t$. Since what happens after time $t$ is independent of what has happened up to then, this is the same as asking for the probability that $k-n$ events occur in an interval of length $s-t$. More rigorously, we may write
$$
\mathbb{P}(\mathsf{N}_s=k\mid \mathsf{T}_n=t)
=\mathbb{P}(\mathsf{N}_t=n,\ \mathsf{N}_s-\mathsf{N}_t=k-n\mid \mathsf{T}_n=t)
=\mathbb{P}(\mathsf{N}_{s-t}=k-n)
=\frac{(\lambda(s-t))^{k-n}e^{-\lambda(s-t)}}{(k-n)!}.
$$
Several things happen in the second equality. Firstly, $\mathsf{N}_s-\mathsf{N}_t$ is independent of the process up to time $t$, so we may split the probability into a product (the event $\{\mathsf{T}_n=t\}$ also depends on the process only up to time $t$). Secondly, the event $\{\mathsf{N}_t=n\}$ is a sure event given $\{\mathsf{T}_n=t\}$, so its conditional probability is $1$. Finally, since $\mathsf{N}_s-\mathsf{N}_t$ is independent of the process up to time $t$, and hence of $\{\mathsf{T}_n=t\}$, the probability is the same whether we condition or not; and since $\mathsf{N}_s-\mathsf{N}_t\sim \mathsf{N}_{s-t}$ by stationary increments, the equality follows.
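As a quick sanity check (my own addition, not part of the argument above), the corrected formula can be verified by Monte Carlo: simulate the process through its exponential interarrival times, condition on $T_n$ landing in a small window around $t$, and compare the empirical frequency of $\{N_s=k\}$ with the Poisson pmf at $k-n$. The concrete parameters below ($\lambda=1$, $n=2$, $t=1$, $s=2$, $k=3$) are arbitrary illustrative choices.

```python
import math
import random

random.seed(0)

lam, n, t, s, k = 1.0, 2, 1.0, 2.0, 3
delta = 0.05          # half-width of the conditioning window around t
hits = cond = 0

for _ in range(200_000):
    # Build arrival times from i.i.d. Exponential(lam) interarrival times,
    # continuing until we are past time s and have at least n arrivals.
    arrivals = []
    T = 0.0
    while T <= s or len(arrivals) < n:
        T += random.expovariate(lam)
        arrivals.append(T)
    # Condition on T_n being (approximately) equal to t.
    if abs(arrivals[n - 1] - t) < delta:
        cond += 1
        if sum(1 for a in arrivals if a <= s) == k:
            hits += 1

est = hits / cond
exact = (lam * (s - t)) ** (k - n) * math.exp(-lam * (s - t)) / math.factorial(k - n)
print(f"simulated {est:.3f}  vs  Poisson pmf {exact:.3f}")
```

Conditioning on a window rather than on the exact value $t$ introduces a small bias, but for a narrow window the estimate agrees with $\frac{(\lambda(s-t))^{k-n}e^{-\lambda(s-t)}}{(k-n)!}\approx 0.368$ to within Monte Carlo error.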

As for the second case, your result is correct. Intuitively, given that the $n$-th event occurs at time $t$, the earlier event times should show no preference for one part of $[0,t]$ over another, i.e. they should be uniform. If so, each of the $n-1$ earlier event times can be seen as a uniform random variable on the interval $[0,t]$, and whether each falls to the left or right of $s$ can be viewed as an independent Bernoulli trial with success probability $s/t$, which gives the binomial distribution.

To see that this is indeed the case, recall that the interarrival times of the process are i.i.d. exponential with parameter $\lambda$, i.e. $\mathsf{T}_k=\mathsf{X}_1+\mathsf{X}_2+\ldots+\mathsf{X}_k$ where $\mathsf{X}_i\overset{\mathrm{i.i.d.}}{\sim}\mathrm{Exponential}(\lambda)$. By the chain rule, the joint density of the first $n$ arrival times is
\begin{align*}
f_{\mathsf{T}_1,\ldots,\mathsf{T}_n}(t_1,\ldots,t_n)
&=f_{\mathsf{T}_1}(t_1)f_{\mathsf{T}_2\mid \mathsf{T}_1}(t_2\mid t_1)\cdots f_{\mathsf{T}_n\mid \mathsf{T}_{n-1}}(t_n\mid t_{n-1}) \\
&=(\lambda \mathrm{e}^{-\lambda t_1})(\lambda \mathrm{e}^{-\lambda(t_2-t_1)})\cdots(\lambda \mathrm{e}^{-\lambda(t_n-t_{n-1})}) \\
&=\lambda^n \mathrm{e}^{-\lambda t_n},
\end{align*}
on the region $t_1<t_2<\ldots<t_n$. Also, since $\mathsf{T}_n\sim\Gamma(n,\lambda)$ (as a sum of i.i.d. exponentials), the conditional density of the first $n-1$ event times given the $n$-th is
$$
f_{\mathsf{T}_1,\ldots,\mathsf{T}_{n-1}\mid \mathsf{T}_n}(t_1,\ldots,t_{n-1}\mid t)
=\frac{\lambda^n \mathrm{e}^{-\lambda t}}{\frac{\lambda^n}{(n-1)!}t^{n-1}\mathrm{e}^{-\lambda t}}
=\frac{(n-1)!}{t^{n-1}}.
$$
This is exactly the joint density of the order statistics of $n-1$ independent uniform random variables on the interval $[0,t]$, i.e.
$(\mathsf{T}_1,\ldots,\mathsf{T}_{n-1})\mid \{\mathsf{T}_n=t\}\sim (\mathsf{U}^{(1)},\mathsf{U}^{(2)},\ldots,\mathsf{U}^{(n-1)})$, where $\mathsf{U}_1,\ldots,\mathsf{U}_{n-1}\overset{\mathrm{i.i.d.}}{\sim}\mathrm{Uniform}[0,t]$. Since we do not care about the order (which we don't), we may as well view the first $n-1$ event times as being i.i.d. uniform on $[0,t]$, as desired.
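The binomial claim can also be checked numerically (again my own illustrative addition): generate the first $n$ arrival times, condition on $T_n$ lying in a small window around $t$, and compare the empirical frequency of $\{N_s=k\}$ with the $B(n-1,\,s/t)$ pmf. The parameter values ($\lambda=1$, $n=3$, $t=2$, $s=1$, $k=1$) are arbitrary.

```python
import math
import random

random.seed(1)

lam, n, t, s, k = 1.0, 3, 2.0, 1.0, 1
delta = 0.05          # half-width of the conditioning window around t
hits = cond = 0

for _ in range(200_000):
    # First n arrival times from i.i.d. Exponential(lam) interarrival times.
    arrivals = []
    T = 0.0
    for _ in range(n):
        T += random.expovariate(lam)
        arrivals.append(T)
    # Condition on T_n being (approximately) equal to t; note s < t here.
    if abs(arrivals[-1] - t) < delta:
        cond += 1
        # N_s counts how many of the first n-1 events fall in [0, s].
        if sum(1 for a in arrivals[: n - 1] if a <= s) == k:
            hits += 1

est = hits / cond
exact = math.comb(n - 1, k) * (s / t) ** k * (1 - s / t) ** (n - 1 - k)
print(f"simulated {est:.3f}  vs  binomial pmf {exact:.3f}")
```

For these values the binomial pmf is $\binom{2}{1}(1/2)(1/2)=0.5$, and the simulated frequency matches it to within Monte Carlo error, consistent with the uniform-order-statistics argument above.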