How to find the expectation of a Poisson process related variable


Q) Let $X_k\sim \operatorname{Exp}(\lambda)$ be iid random variables representing the interarrival times of a Poisson process of rate $\lambda$, and let $Y_k$ be the arrival times of the events. Let $$Z = \sum_{k=1}^{N(t)} e^{-(t-Y_k)},$$ where $Y_1<Y_2<\cdots<Y_{N(t)}\le t$, i.e. $N(t)$ events have occurred by time $t$. Find $EZ$ and $\operatorname{Var}(Z)$.

I have two questions. Does the $EZ$ calculation below look right, and is there a simpler way to find $\operatorname{Var}(Z)$? The expressions I have are getting large.

$N(t)\sim \operatorname{Pois}(\lambda t)$, and $Y_k\sim \operatorname{Erlang}(k,\lambda)$ because $Y_k$ is the sum of $k$ iid exponential r.v.'s.
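The Erlang claim is easy to spot-check by simulation. A minimal sketch (the parameter values $k=4$, $\lambda=2$ are illustrative, not from the problem):

```python
import math
import random

def erlang_sample(k, lam, rng):
    """Y_k as a sum of k iid Exp(lam) interarrival times (inverse-CDF sampling)."""
    return sum(-math.log(1.0 - rng.random()) / lam for _ in range(k))

# Illustrative parameter choices (not from the problem): k = 4, lam = 2
rng = random.Random(0)
k, lam = 4, 2.0
samples = [erlang_sample(k, lam, rng) for _ in range(200_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# Erlang(k, lam) has mean k/lam and variance k/lam^2
print(mean, k / lam)
print(var, k / lam ** 2)
```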

$$\begin{align} EZ &= E[E[Z\mid N(t)]] \\ &= E\left[\sum_{k=1}^{N(t)} Ee^{-(t-Y_k)}\right] \\ &= E\left[\sum_{k=1}^{N(t)}\int_0^{\infty} e^{-(t-x)}\frac{\lambda^k x^{k-1}e^{-\lambda x}}{(k-1)!}dx \right] \\ &= E\left[\sum_{k=1}^{N(t)}\frac{\lambda^k e^{-t}}{(k-1)!} \int_0^\infty e^{(1-\lambda )x}x^{k-1}dx \right] \\ \end{align} $$

In general,

$$\int_0^{\infty}e^{ax}x^n\,dx = (-a)^{-n-1}\Gamma(n+1), \qquad a<0.$$
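This identity can be spot-checked numerically; a sketch using simple trapezoidal quadrature (the values $a=-1.5$, i.e. $\lambda=2.5$, and $n=3$ are illustrative choices):

```python
import math

def f(x, a, n):
    """Integrand e^{a x} x^n."""
    return math.exp(a * x) * x ** n

def gamma_integral(a, n, upper=40.0, steps=200_000):
    """Trapezoidal estimate of the integral on [0, upper]; for a < 0 the
    tail beyond `upper` is negligible, so the truncation is safe."""
    h = upper / steps
    total = 0.5 * (f(0.0, a, n) + f(upper, a, n))
    total += sum(f(i * h, a, n) for i in range(1, steps))
    return total * h

# Illustrative values: a = 1 - lambda with lambda = 2.5, exponent n = 3
a, n = -1.5, 3
approx = gamma_integral(a, n)
closed_form = (-a) ** (-(n + 1)) * math.factorial(n)  # (-a)^{-n-1} * Gamma(n+1)
print(approx, closed_form)
```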

Thus: $$\begin{align} EZ &= E\left[\sum_{k=1}^{N(t)}\frac{\lambda^k e^{-t}}{(k-1)!}\int_0^{\infty} e^{(1-\lambda )x}x^{k-1}dx \right] \\ &= E\left[\sum_{k=1}^{N(t)}\frac{\lambda^k e^{-t}}{(k-1)!} (\lambda - 1)^{-k}\Gamma(k)\right] \\ &= E\left[\sum_{k=1}^{N(t)}\left( \frac{\lambda}{\lambda-1}\right)^k e^{-t} \right] \\ &= e^{-t} E\left[ \frac{1-\left(\frac{\lambda}{\lambda-1}\right)^{N(t)+1}}{1-\frac{\lambda}{\lambda-1}} - 1 \right] \tag{1}\\ \end{align} $$

Note that since $N(t)\sim \operatorname{Pois}(\lambda t)$:

$$ \begin{align} E\left[ \left(\frac{\lambda}{\lambda-1}\right)^{N(t)+1} \right] &= \frac{\lambda}{\lambda-1}\, e^{-\lambda t}\sum_{k=0}^{\infty} \left(\frac{\lambda}{\lambda-1}\right)^{k} \frac{(\lambda t)^k}{k!}\\ &= \frac{\lambda}{\lambda-1}\, e^{-\lambda t}\sum_{k=0}^{\infty} \frac{1}{k!}\left(\frac{\lambda ^2 t}{\lambda-1}\right)^k \\ &= \frac{\lambda}{\lambda-1}\, e^{-\lambda t}\, e^{\frac{\lambda^2 t}{\lambda-1}} = \frac{\lambda}{\lambda-1}\, e^{\frac{\lambda t}{\lambda-1}} \tag{2} \end{align} $$

Plugging $(2)$ into $(1)$ gives $EZ$.
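Whatever closed form results, it can be sanity-checked by Monte Carlo. Campbell's theorem for Poisson processes gives the benchmarks $EZ = \lambda\int_0^t e^{-(t-s)}\,ds = \lambda(1-e^{-t})$ and $\operatorname{Var}(Z) = \lambda\int_0^t e^{-2(t-s)}\,ds = \frac{\lambda}{2}(1-e^{-2t})$. A minimal simulation sketch (the values $\lambda=2$, $t=3$ are illustrative):

```python
import math
import random

def simulate_Z(lam, t, rng):
    """One draw of Z = sum over arrivals Y_k <= t of exp(-(t - Y_k)),
    building the Poisson process from Exp(lam) interarrival times."""
    z, y = 0.0, 0.0
    while True:
        y += -math.log(1.0 - rng.random()) / lam  # next interarrival time
        if y > t:
            return z
        z += math.exp(-(t - y))

# Illustrative parameter choices (not from the problem): lam = 2, t = 3
rng = random.Random(1)
lam, t = 2.0, 3.0
draws = [simulate_Z(lam, t, rng) for _ in range(200_000)]
mean = sum(draws) / len(draws)
var = sum((z - mean) ** 2 for z in draws) / len(draws)
# Campbell's theorem: E Z = lam (1 - e^{-t}), Var Z = (lam / 2)(1 - e^{-2t})
print(mean, lam * (1 - math.exp(-t)))
print(var, lam / 2 * (1 - math.exp(-2 * t)))
```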

1 Answer
Tried an approach that I thought would be simpler, but it ended up a bigger mess than expected. Posting it here in case someone can do something with it, but I'll probably delete it later...

$e^t Z = \sum_{k=1}^\infty e^{Y_k} \mathbf{1}_{\{Y_k \le t\}}$

By Tonelli's theorem, it suffices to compute $E[e^{Y_k} \mathbf{1}_{\{Y_k \le t\}}]$ and sum over $k$.

$$E[e^{Y_k} \mathbf{1}_{\{Y_k \le t\}} \mid N(t) = n] = E[e^{Y_k} \mid N(t) = n] \mathbf{1}_{\{k \le n\}}.$$

Conditioned on $N(t)=n$, the random variable $Y_k$ has the same distribution as the $k$th order statistic of $n$ i.i.d. $\text{Uniform}(0,t)$ random variables. (See Theorem 1.5 of these notes.) In particular, $Y_k/t$ conditionally has a $\text{Beta}(k, n+1-k)$ distribution, so the expectation we want can be evaluated using the Beta moment generating function. We have $$E[e^{Y_k} \mathbf{1}_{\{Y_k \le t\}} \mid N(t) = n] =\mathbf{1}_{\{k \le n\}} \left(1 + \sum_{j=1}^\infty \frac{t^j}{j!} \prod_{r=0}^{j-1}\frac{k+r}{n+1+r}\right) \tag{1}$$
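Since, conditioned on $N(t)=n$, $Y_k$ is the $k$th order statistic of $n$ iid $\text{Uniform}(0,t)$ variables, the series above can be checked against a direct simulation of that order statistic (the values $k=2$, $n=5$, $t=1.5$ are illustrative):

```python
import math
import random

def mgf_series(k, n, t, terms=80):
    """1 + sum_j (t^j / j!) * prod_{r=0}^{j-1} (k+r)/(n+1+r), truncated."""
    total, term = 1.0, 1.0
    for j in range(1, terms):
        term *= t * (k + j - 1) / ((n + j) * j)  # ratio of consecutive terms
        total += term
    return total

# Illustrative values: k = 2, n = 5, t = 1.5
rng = random.Random(2)
k, n, t = 2, 5, 1.5
m = 200_000
acc = 0.0
for _ in range(m):
    u = sorted(rng.uniform(0, t) for _ in range(n))  # n iid Uniform(0, t)
    acc += math.exp(u[k - 1])                        # e^{k-th order statistic}
mc_mean = acc / m
print(mc_mean, mgf_series(k, n, t))
```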

\begin{align} E[e^{Y_k} \mathbf{1}_{\{Y_k \le t\}}] &= \sum_{n=0}^\infty P(N(t)=n)\, E[e^{Y_k} \mathbf{1}_{\{Y_k \le t\}} \mid N(t) = n] \\ &= \sum_{n=k}^\infty e^{-\lambda t}\frac{(\lambda t)^n}{n!} \left(1 + \sum_{j=1}^\infty \frac{t^j}{j!} \prod_{r=0}^{j-1}\frac{k+r}{n+1+r}\right). \end{align} Then sum over all $k \ge 1$ and multiply by $e^{-t}$.
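For what it's worth, this double sum does check out numerically: truncating the sums over $n$, $k$, and $j$ and comparing against the Campbell-theorem value $EZ = \lambda(1-e^{-t})$ (the truncation bounds and the values $\lambda=1.5$, $t=2$ are my illustrative choices):

```python
import math

def EZ_double_sum(lam, t, n_max=80, j_max=80):
    """Truncation of e^{-t} * sum_{k>=1} sum_{n>=k} P(N(t)=n) * E[e^{Y_k} | N(t)=n]."""
    total = 0.0
    for n in range(1, n_max):
        p = math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)  # P(N(t)=n)
        for k in range(1, n + 1):  # indicator k <= n
            s, term = 1.0, 1.0
            for j in range(1, j_max):
                term *= t * (k + j - 1) / ((n + j) * j)  # (t^j/j!) prod (k+r)/(n+1+r)
                s += term
            total += p * s
    return math.exp(-t) * total

# Illustrative values: lam = 1.5, t = 2; Campbell's theorem predicts lam(1 - e^{-t})
lam, t = 1.5, 2.0
print(EZ_double_sum(lam, t), lam * (1 - math.exp(-t)))
```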