Poisson Process with Changing Rate


Suppose we have an infinite number of counters (numbered $0,1,2,\dots$) which fire (i.e. increment by $1$) independently according to a Poisson process of rate $1$. All counters are initially switched off. At time $0$ we turn on counter $0$. Each time counter $0$ fires, the next counter which is still off turns on (e.g. the first time counter $0$ fires, counter $1$ turns on, the second time counter $0$ fires, counter $2$ turns on, etc.). What is the expected number of fires over all counters in the interval $[1,2]$?


Here is my attempt: let $T_i$ denote the number of times counter $i$ fires in $[1,2]$, and let $T=T_0+T_1+T_2+\cdots$. Then $$\mathbb{E}[T]=\sum_{i=0}^{\infty}\mathbb{E}[T_i].$$ Note $\mathbb{E}[T_0]=\lambda(2-1)=1$ (here $\lambda=1$). $\mathbb{E}[T_1]$ is a little trickier, but we can compute it by conditioning on the first firing time $F_0$ of counter $0$: $$\mathbb{E}[T_1]=\int_0^{2}\mathbb{E}[T_1\mid F_0=t]\,p_{F_0}(t)\,dt=\int_0^{1}\mathbb{E}[T_1\mid F_0=t]\,p_{F_0}(t)\,dt+\int_1^{2}\mathbb{E}[T_1\mid F_0=t]\,p_{F_0}(t)\,dt$$ $$=\lambda(2-1)\cdot\mathbb{P}(F_0\leq 1)+\int_1^2\lambda(2-t)\,p_{F_0}(t)\,dt.$$ Here $\mathbb{P}(F_0\leq t)=1-e^{-\lambda t}=1-e^{-t}$, so $p_{F_0}(t)=\frac{d}{dt}\mathbb{P}(F_0\leq t)=e^{-t}$, and the above becomes $$1-e^{-1}+\int_1^2(2-t)e^{-t}\,dt=1-e^{-1}+e^{-2}.$$

Now we can proceed similarly for $\mathbb{E}[T_2]$, but it starts to get messy: conditioning on $F_1=t$, where $F_1$ is the time of counter $0$'s second firing (the moment counter $2$ turns on), requires the density $$p_{F_1}(t)=\int_0^t p_{F_0}(t_0)\,p_{F_0}(t-t_0)\,dt_0=\int_0^t e^{-t}\,dt_0=te^{-t}.$$ If we compute this expectation and compare it to the previous one, a pattern probably starts to emerge.

However, does anyone have a cleaner approach (without all this messy algebra)? Any ideas would be appreciated (even if it's just building on my attempt)! Thanks!
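Before doing more algebra, a quick Monte Carlo run pins down the target value. This is a sketch using only the standard library; `firing_times` and `fires_in_window` are my own helper names.

```python
import random

rng = random.Random(42)

def firing_times(start, end):
    """Firing times in (start, end] of a rate-1 Poisson counter switched on at `start`."""
    times, t = [], start
    while True:
        t += rng.expovariate(1.0)
        if t > end:
            return times
        times.append(t)

def fires_in_window(a=1.0, b=2.0):
    """One realization: total firings in [a, b] over all counters."""
    zero = firing_times(0.0, b)            # counter 0 runs from time 0
    total = sum(a <= s <= b for s in zero)
    for s in zero:                         # each firing switches on a fresh counter at time s
        total += sum(a <= u <= b for u in firing_times(s, b))
    return total

n = 100_000
estimate = sum(fires_in_window() for _ in range(n)) / n
print(estimate)  # lands close to the exact answer 5/2 derived below
```

With $10^5$ trials the standard error is under $0.01$, so the estimate reliably identifies the answer to two decimal places.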


There are 2 answers below.

Best answer (by angryavian):

Addendum: this answer computes an expression for $E[T_i]$ that is quite complicated. However, it is possible to compute $E[T]$ directly without computing each $E[T_i]$; see Matthew H's answer.


Your work seems reasonable to me. In general, if $S_i$ is the time of the $i$th firing of counter zero (note that $S_1$ is your $F_0$, and $S_2$ is your $F_1$), you've essentially shown that

$$E[T_i] = E[T_i \mathbf{1}_{S_i \le 1}] + E[T_i \mathbf{1}_{1 < S_i < 2}] = P(S_i \le 1) + E[(2-S_i) \mathbf{1}_{1 < S_i < 2}].$$

Let $N_{[a,b]}$ denote the number of times counter zero fires in $[a,b]$. We know $$P(S_i \le 1) = 1 - P(N_{[0, 1]} \le i-1) = 1 - e^{-1} \sum_{j=0}^{i-1} \frac{1}{j!}.$$

For the second term, I think finding the density of $S_i$ is unavoidable, due to the necessary truncation onto the interval $[1, 2]$. Your induction idea would work, and would eventually lead you to $f_{S_i}(t) = \frac{1}{(i-1)!} t^{i-1} e^{-t}$, the density of the $\text{Gamma}(i,1)$ distribution.
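As a numerical sanity check (a sketch; the function names are mine), integrating the $\text{Gamma}(i,1)$ density over $[0,1]$ should reproduce the Poisson-tail formula for $P(S_i \le 1)$ above.

```python
import math

def gamma_density(t, i):
    # Density of S_i ~ Gamma(i, 1): t^(i-1) e^(-t) / (i-1)!
    return t ** (i - 1) * math.exp(-t) / math.factorial(i - 1)

def p_le_1(i):
    # P(S_i <= 1) = 1 - e^(-1) * sum_{j=0}^{i-1} 1/j!   (Poisson tail)
    return 1 - math.exp(-1) * sum(1 / math.factorial(j) for j in range(i))

def integral_to_1(i, n=10_000):
    # Midpoint-rule integral of the Gamma(i, 1) density over [0, 1]
    return sum(gamma_density((k + 0.5) / n, i) for k in range(n)) / n

for i in (1, 2, 5):
    print(i, integral_to_1(i), p_le_1(i))  # the two columns should match
```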


In order to compute $E[(2-S_i) \mathbf{1}_{1 < S_i < 2}]$ it suffices to compute $E[\mathbf{1}_{1 < S_i < 2}]$ and $E[S_i \mathbf{1}_{1 < S_i < 2}]$. The first term is $$ E[\mathbf{1}_{1 < S_i < 2}] = P(1 < S_i < 2) = P(N_{[0,1]} \le i-1) - P(N_{[0, 2]} \le i-1) = e^{-1} \sum_{j=0}^{i-1} \frac{1}{j!} - e^{-2} \sum_{j=0}^{i-1} \frac{2^j}{j!}.$$

The second term can be rewritten into something resembling the first term. $$E[S_i \mathbf{1}_{1 < S_i < 2}] = \int_1^2 t \frac{1}{(i-1)!} t^{i-1} e^{-t} \, dt = i \int_1^2 \frac{1}{i!} t^i e^{-t} \, dt = i E[\mathbf{1}_{1 < S_{i+1} < 2}] = i \left(e^{-1} \sum_{j=0}^{i} \frac{1}{j!} - e^{-2} \sum_{j=0}^{i} \frac{2^j}{j!}\right).$$


Putting everything together, we have

\begin{align} E[T_i] &= 1 - e^{-1} \sum_{j=0}^{i-1} \frac{1}{j!} + 2\left(e^{-1} \sum_{j=0}^{i-1} \frac{1}{j!} - e^{-2} \sum_{j=0}^{i-1} \frac{2^j}{j!} \right) - i\left(e^{-1} \sum_{j=0}^{i} \frac{1}{j!} - e^{-2} \sum_{j=0}^{i} \frac{2^j}{j!}\right) \\ &= 1 - (i-1) e^{-1} \sum_{j=0}^{i-1} \frac{1}{j!} + (i-2) e^{-2} \sum_{j=0}^{i-1} \frac{2^j}{j!} + i\left(e^{-2} \frac{2^i}{i!} - e^{-1} \frac{1}{i!}\right) \end{align}

When $i=1$ we have $$E[T_1] = 1 - 0 + (-1) e^{-2} + (2e^{-2} - e^{-1}) = 1 - e^{-1} + e^{-2},$$ which agrees with your result.
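The closed form is easy to evaluate numerically; a sketch, with `E_T` my own name for the boxed expression. Summing over $i$ (plus $E[T_0]=1$) should recover the total $5/2$ obtained in the other answer.

```python
import math

def E_T(i):
    # The closed-form E[T_i] derived above, valid for i >= 1
    e1, e2 = math.exp(-1), math.exp(-2)
    s1 = sum(1 / math.factorial(j) for j in range(i))       # sum_{j=0}^{i-1} 1/j!
    s2 = sum(2 ** j / math.factorial(j) for j in range(i))  # sum_{j=0}^{i-1} 2^j/j!
    return (1 - (i - 1) * e1 * s1 + (i - 2) * e2 * s2
            + i * (e2 * 2 ** i / math.factorial(i) - e1 / math.factorial(i)))

# i = 1 reproduces 1 - e^{-1} + e^{-2}
assert abs(E_T(1) - (1 - math.exp(-1) + math.exp(-2))) < 1e-12

# E[T_0] = 1 plus the series over i >= 1 (terms decay factorially, so 60 is plenty)
total = 1 + sum(E_T(i) for i in range(1, 60))
print(total)
```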

Answer (by Matthew H):

Here is how you can compute $\mathbb{E}(T)$ without computing each $\mathbb{E}(T_i)$.

For $t>0$ let $N_t$ represent the total number of firings on $[0,t]$. Take $X_t$ as the number of times counter $0$ fired on $[0,t]$. Clearly $N_t=0$ if $X_t=0$, so suppose $X_t=k$ for $k=1,2,3,...$

Let $S_1,...,S_k$ denote the $k$ firing times by counter $0$ on $[0,t]$.

Since the expected number of firings made by counter $j$ on $[0,t]$, for $1\leq j \leq k$, is $t-S_j$ (counter $j$ runs at rate $1$ from time $S_j$ to $t$), we can say for any $k\geq 1$

$$\mathbb{E}(N_t|X_t=k,S_1,...,S_k)=k+\sum_{j=1}^k(t-S_j)=k+kt-\sum_{j=1}^kS_j$$

We can express this relationship as

$$\mathbb{E}(N_t|X_t,S_1,...,S_{X_t})=(1+t)X_t-\sum_{j=1}^{X_{t}}S_j$$

It is well known that, conditional on $X_t=k$, $(S_1,...,S_k)$ has the same joint distribution as $(Y_{(1)},...,Y_{(k)})$, the order statistics of iid $Y_1,...,Y_k\sim \mathcal{U}[0,t]$. (This is only true when conditioning on the number of arrivals!) Therefore, using the fact that the order statistics sum to the same total as the sample itself,

$$\begin{eqnarray*} \mathbb{E}(N_t) &=& \mathbb{E}\left(\mathbb{E}(N_t|X_t,S_1,...,S_{X_{t}})\right) \\ &=& \mathbb{E}\left((1+t)X_t-\sum_{j=1}^{X_{t}}S_j\right) \\ &=& (1+t)\mathbb{E}(X_t)-\mathbb{E}\left(\sum_{j=1}^{X_t}Y_{(j)}\right) \\ &=& (1+t)t-\mathbb{E}\left(\sum_{j=1}^{X_t}Y_j\right) \\ &=& (1+t)t-\mathbb{E}(X_t)\mathbb{E}(Y_1) \\ &=& \frac{t^2}{2}+t\end{eqnarray*}$$

Finally, $$\mathbb{E}\left(\text{# of arrivals on [1,2]}\right)=\mathbb{E}(N_2-N_1)=\mathbb{E}(N_2)-\mathbb{E}(N_1)=\left(2+\tfrac{4}{2}\right)-\left(1+\tfrac{1}{2}\right)=\frac{5}{2}$$ This turns out to equal $\sum_{j=0}^{\infty}\mathbb{E}(T_j)$ using the formula for $\mathbb{E}(T_j)$ developed by @angryavian.
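The intermediate formula $\mathbb{E}(N_t)=t+t^2/2$ is easy to spot-check by simulation at several values of $t$. A sketch using only the standard library; `simulate_N` is my own helper.

```python
import random

rng = random.Random(0)

def simulate_N(t):
    # One realization of N_t: counter 0's firings on [0, t],
    # plus the firings of each counter it switches on.
    def fires(start):
        times, s = [], start
        while True:
            s += rng.expovariate(1.0)
            if s > t:
                return times
            times.append(s)
    zero = fires(0.0)
    return len(zero) + sum(len(fires(s)) for s in zero)

n = 100_000
estimates = {t: sum(simulate_N(t) for _ in range(n)) / n for t in (0.5, 1.0, 2.0)}
for t, est in estimates.items():
    print(t, est, t + t * t / 2)  # simulated mean vs. t + t^2/2
```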