Poisson Process expectation of time of an event given number of events until that time shows Uniform distribution characteristics


Question: On a weekday, buses arrive at a certain stop according to a Poisson process with rate $\lambda = 2$ per hour. A regular weekday is assumed to start at 6:00 AM.

Given that exactly $10$ buses have arrived before noon (12:00 PM), what is the expected arrival time of the tenth bus?

Solution: Define

$S_i$: Arrival time of the $i^{th}$ bus.

$N_t$: Number of bus arrivals until time $t$.

We are asked to find $E[\ S_{10}|\ N_6 = 10\ ]$. Since $S_{10}$ is a nonnegative random variable, the tail formula for expectations gives

$$E[\ S_{10}|\ N_6 = 10\ ] = \int_{t=0}^{\infty}P(S_{10}>t|\ N_6 = 10)dt$$ $$= \int_{t=0}^{6}P(S_{10}>t|\ N_6 = 10)dt+\int_{t=6}^{\infty}P(S_{10}>t|\ N_6 = 10)dt$$

But $P(S_{10}>t|\ N_6 = 10) = 0$ for $t > 6$, since the tenth bus has already arrived by time $6$. Therefore,

$$ = \int_{t=0}^{6}P(S_{10}>t|\ N_6 = 10)dt$$ $$ = \int_{t=0}^{6}\big[1-P(S_{10}\le t|\ N_6 = 10)\big]dt$$ $$ = \int_{t=0}^{6}\big[1-P(N_t \ge 10|\ N_6 = 10)\big]dt$$ $$ = \int_{t=0}^{6}\big[1-P(N_t = 10|\ N_6 = 10)\big]dt$$

where the last equality holds because $t \le 6$ and $N_6 = 10$ imply $N_t \le 10$; combined with $N_t \ge 10$, this forces $N_t = 10$. Therefore,

By independent increments, $$P(N_t = 10|\ N_6 = 10) = \frac{P(N_t = 10)\,P(N_6 - N_t = 0)}{P(N_6 = 10)} = \frac{e^{-\lambda t}\frac{(\lambda t)^{10}}{10!}\, e^{-\lambda(6-t)}}{e^{-6\lambda}\frac{(6\lambda)^{10}}{10!}} = \bigg(\frac{t}{6}\bigg)^{10},$$ so $$ = \int_{t=0}^{6}\bigg[1-\bigg(\frac{t}{6}\bigg)^{10}\bigg]dt = 6 - \frac{6}{11} = \frac{60}{11}$$
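As a quick sanity check (a Monte Carlo sketch, not part of the solution; the function name `simulate_conditional_mean` is my own), one can simulate the Poisson process on $[0, 6]$, keep only the runs with exactly $10$ arrivals, and average the tenth arrival time:

```python
import random

def simulate_conditional_mean(lam=2.0, t0=6.0, n=10, trials=200_000, seed=42):
    """Estimate E[S_n | N_{t0} = n] by rejection sampling: simulate the
    Poisson process on [0, t0] and keep only runs with exactly n arrivals."""
    rng = random.Random(seed)
    total, kept = 0.0, 0
    for _ in range(trials):
        t, arrivals = 0.0, []
        while True:
            t += rng.expovariate(lam)   # i.i.d. Exp(lam) inter-arrival times
            if t > t0:
                break
            arrivals.append(t)
        if len(arrivals) == n:
            total += arrivals[-1]       # S_n: time of the n-th (last) arrival
            kept += 1
    return total / kept

est = simulate_conditional_mean()
print(est)  # should be close to 60/11 ≈ 5.4545
```

With rate $2$ on an interval of length $6$, roughly $10\%$ of the runs have exactly $10$ arrivals, so rejection sampling is cheap here.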

Now, looking at the answer, it is as if the $10$ events divide the interval of length $6$ into $11$ pieces of equal expected length, so that the tenth event falls on average at time $\frac{60}{11}$.

My question is: is there a combinatorial or intuitive argument that reaches this conclusion without the above calculations? Why does this expectation behave like the expectation of a uniform order statistic? Thanks in advance.

Accepted Answer:

For a Poisson process with rate $\lambda$, the following general result holds.

Given $N_{t_0} = n$, the conditional joint distribution of $S_1, \ldots, S_n$ has the same joint distribution as the joint distribution of the order statistics $U_{(1)} < \cdots < U_{(n)}$ of $n$ i.i.d. $\text{Uniform}(0, t_0)$ random variables $U_1, \ldots, U_n$.

This result is much stronger than you need for this particular problem, but helps explain why you observe something related to uniform random variables in your computations.

If we use the above result, your problem becomes $$E[U_{(n)}] = \int_0^{t_0} P(U_{(n)} > t) \, dt = \int_0^{t_0} \big(1 - (t/t_0)^n\big) \, dt = \frac{n}{n+1}\, t_0$$ for $t_0 = 6$ and $n = 10$, which gives $\frac{60}{11}$ and is exactly the resemblance you observed.
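The order-statistic identity $E[U_{(n)}] = \frac{n}{n+1} t_0$ can also be checked numerically; since $U_{(n)}$ is just the maximum of $n$ i.i.d. $\text{Uniform}(0, t_0)$ draws, a short simulation (my own sketch, not part of the answer) suffices:

```python
import random

rng = random.Random(0)
t0, n, trials = 6.0, 10, 200_000
# U_(n) is simply the maximum of n i.i.d. Uniform(0, t0) draws
mean_max = sum(
    max(rng.uniform(0, t0) for _ in range(n)) for _ in range(trials)
) / trials
print(mean_max)  # should be close to n*t0/(n+1) = 60/11 ≈ 5.4545
```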


To prove the above result, I now show that the conditional joint density of $(S_1, \ldots, S_n)$ given $N_{t_0} = n$ is precisely the joint density of the order statistics of $n$ i.i.d. $\text{Uniform}(0, t_0)$ random variables.

For $0 < s_1 < \cdots < s_n < t_0$ the density satisfies $$f_{S_1, \ldots, S_n \mid N_{t_0} = n}(s_1, \ldots, s_n) \, ds_1 \cdots \, ds_n \approx \frac{P(S_1 \in [s_1, s_1 + ds_1), \ldots, S_n \in [s_n, s_n + ds_n), N_{t_0} = n)}{P(N_{t_0} = n)}.$$ The numerator is \begin{align} &P(S_1 \in [s_1, s_1 + ds_1), \ldots, S_n \in [s_n, s_n + ds_n), N_{t_0} = n) \\ &= P(\text{one arrival in each interval $[s_i, s_i + ds_i)$ for $i=1,\ldots, n$, and no other arrivals in $[0, t_0]$}) \\ &= P(N_{[s_1,s_1 + ds_1)} = 1) \cdots P(N_{[s_n,s_n + ds_n)} = 1) \cdot P(\text{no other arrivals in $[0, t_0]$}) \\ &\approx \lambda\, ds_1 \cdot \lambda \, ds_2 \cdots \lambda \, ds_n \cdot e^{-\lambda t_0} \\ &= \lambda^n e^{-\lambda t_0} \, ds_1 \cdots \, ds_n. \end{align} Above, we used the independence of the process over disjoint intervals and the fact that $P(N_{[s_i,s_i + ds_i)} = 1) = e^{-\lambda \, ds_i} (\lambda \, ds_i) \approx \lambda \, ds_i$ since $e^{-\lambda \, ds_i} = 1 + O(ds_i)$.

Dividing by $P(N_{t_0} = n) = e^{-\lambda t_0} (\lambda t_0)^n/n!$ yields the conditional density $$\frac{n!}{t_0^n} \, ds_1 \cdots \, ds_n.$$

There are various arguments to show that this is the density of $(U_{(1)}, \ldots, U_{(n)})$, the order statistics of i.i.d. $\text{Uniform}(0, t_0)$ random variables.
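For instance, a direct counting argument: the unordered vector $(U_1, \ldots, U_n)$ has density $1/t_0^n$ on $[0, t_0]^n$, and each ordered point arises from exactly $n!$ permutations of the unordered sample, so for $0 < s_1 < \cdots < s_n < t_0$,
$$f_{U_{(1)}, \ldots, U_{(n)}}(s_1, \ldots, s_n) = n! \cdot \frac{1}{t_0^n} = \frac{n!}{t_0^n}.$$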