Conditional expectation of the first arrival time in a merged Poisson process, given that the first arrival comes from process A


There are two Poisson processes, process A with rate $\lambda$ and process B with rate $\mu$. These two processes (A and B) can be merged to yield a Poisson process C with rate $\lambda + \mu$. Let D be the event that the first arrival of the merged process C is from process A. Then the probability of event D is $$P(D)=\frac{\lambda}{\lambda + \mu}$$

Let $T$ be the time until the first arrival in the merged process C. Then the expectation of $T$ is: $$E[T]=\frac{1}{\lambda + \mu}$$

Now my question is: what is the conditional expectation of the time until the first arrival in the merged process C, conditioned on the event D (that the first arrival is from process A)? That is, what is $$E[T\mid D]?$$

Is it $E[T\mid D]=\frac{1}{\lambda}$ or $E[T\mid D]=\frac{1}{\lambda+\mu}$, and especially why?

I tried to find the PDF $f_{T\mid D}$ but got nowhere, since it comes down to finding the probability $$P(\{T \geq t\} \cap D)$$ and I could not do that.

The question comes from trying problem 3 of http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-041sc-probabilistic-systems-analysis-and-applied-probability-fall-2013/unit-iii/lecture-15/MIT6_041SCF13_assn07.pdf
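Before doing the math, a quick Monte Carlo check can suggest which candidate value is right. Here is a sketch: the first arrival of A is $\mathrm{Exp}(\lambda)$, the first arrival of B is $\mathrm{Exp}(\mu)$, and we average $T$ over the runs where A arrives first (the rate values below are arbitrary choices of mine):

```python
import random

# Monte Carlo check of E[T | D] for a merged Poisson process.
# The first arrival of A is Exp(lam), of B is Exp(mu); T = min of the two,
# and D is the event that A arrives first. Rates are arbitrary choices.
random.seed(0)
lam, mu = 2.0, 3.0
times_given_D = []
for _ in range(200_000):
    x = random.expovariate(lam)   # first arrival time of process A
    y = random.expovariate(mu)    # first arrival time of process B
    if x < y:                     # event D: first arrival comes from A
        times_given_D.append(min(x, y))

est = sum(times_given_D) / len(times_given_D)
print(est)   # close to 1/(lam+mu) = 0.2, not 1/lam = 0.5
```

The estimate lands near $1/(\lambda+\mu)$, which is what both answers below confirm analytically.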

Accepted answer:

In the Poisson process one can let $T_x$ be the time until the $x$th arrival and $X_t$ be the number of arrivals before time $t$, and then the event $$[T_x>t] \tag T$$ is the same as the event $$[X_t < x]. \tag X$$ So you're looking for $$\operatorname{E}(\min\{T_1^A, T_1^B\} \mid T_1^A<T_1^B) = \operatorname{E}(T_1^A\mid T_1^A<T_1^B). \tag E$$

We have a joint distribution

$$ f_{T_1^A,T_1^B} (s,t) \, ds\,dt = e^{-\lambda s} e^{-\mu t} (\lambda\,ds) (\mu\,dt) \text{ for } s>0,\ t>0 $$ and a conditional distribution given $T_1^A<T_1^B$ (an event of probability $\lambda/(\lambda+\mu)$): $$ f_{T_1^A,T_1^B\,\mid\, T_1^A<T_1^B} (s,t) \, ds\,dt = \frac{e^{-\lambda s} e^{-\mu t} (\lambda\,ds) (\mu\,dt)}{\lambda/(\lambda+\mu)} \text{ for } 0<s<t. $$ So the expected value sought in $(\text{E})$ is \begin{align} & \iint\limits_{\{(s,t)\,:\,0<s<t\}} s \, \frac{e^{-\lambda s} e^{-\mu t} (\lambda\,ds) (\mu\,dt)}{\lambda/(\lambda+\mu)} \\[10pt] = {} & (\lambda + \mu) \int_0^\infty \mu e^{-\mu t} \left( \int_0^t s e^{-\lambda s} \,ds \right) dt. \end{align} The inner integral is $$ \int_0^t s e^{-\lambda s} \,ds = \frac{1}{\lambda^2}\left(1 - e^{-\lambda t}(1+\lambda t)\right), $$ so, using $$ \int_0^\infty \mu e^{-\mu t}\,dt = 1, \quad \int_0^\infty \mu e^{-(\lambda+\mu) t}\,dt = \frac{\mu}{\lambda+\mu}, \quad \int_0^\infty \lambda t\, \mu e^{-(\lambda+\mu) t}\,dt = \frac{\lambda\mu}{(\lambda+\mu)^2}, $$ the expectation becomes $$ \frac{\lambda+\mu}{\lambda^2}\left(1 - \frac{\mu}{\lambda+\mu} - \frac{\lambda\mu}{(\lambda+\mu)^2}\right) = \frac{\lambda+\mu}{\lambda^2}\cdot\frac{(\lambda+\mu)^2 - \mu(\lambda+\mu) - \lambda\mu}{(\lambda+\mu)^2} = \frac{\lambda+\mu}{\lambda^2}\cdot\frac{\lambda^2}{(\lambda+\mu)^2} = \frac{1}{\lambda+\mu}. $$

So $E[T\mid D] = \dfrac{1}{\lambda+\mu}$: conditioning on $D$ does not change the distribution of the time until the first arrival.
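The double integral above can be checked numerically with a plain midpoint rule, no special libraries needed (the rates are arbitrary choices, and `t_max` is a hypothetical cutoff truncating the infinite range, which is harmless here since the tail is exponentially small):

```python
import math

# Numerical check of E[T | D] =
#   (lam+mu) * \int_0^inf mu*e^{-mu*t} ( \int_0^t s*e^{-lam*s} ds ) dt
# via a midpoint rule; the inner integral is accumulated as t grows.
lam, mu = 2.0, 3.0
h, t_max = 0.0005, 10.0

total = 0.0
inner = 0.0                # running value of \int_0^t s*e^{-lam*s} ds
for i in range(int(t_max / h)):
    t = (i + 0.5) * h      # midpoint of the current slab
    inner += t * math.exp(-lam * t) * h          # extend inner integral
    total += mu * math.exp(-mu * t) * inner * h  # outer integrand

est = (lam + mu) * total
print(est, 1 / (lam + mu))   # the two values agree
```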

Second answer:

This elaborates on comments by A. S. and explains how to compute $$P(\{T \geq t\} \cap D)$$ for $t>0$. Consequently, $T\mid D$ is memoryless; so is $T\mid D^c$, by the same argument.

We may rewrite the problem as follows: let $X$, $Y$ be independent with $$X \sim \mathrm{Exp}(\lambda), \qquad Y\sim\mathrm{Exp}(\mu).$$ Let $D$ be the event that $X<Y$, and let $T=\min(X,Y)$. We will show that $$P(\{T\geq t\} \cap D)=P(X\geq t)\,P(Y\geq t)\,P(D).$$

Consider the conditional probability $$P(X<Y \mid X\geq t,\ Y\geq t)=\frac{P(\{T\geq t\} \cap D)}{P(X\geq t,\ Y\geq t)}.$$ By the memoryless property, the joint distribution of $(X-t, Y-t)$ given that $X\geq t$, $Y\geq t$ is identical to that of $(X, Y)$. Also note that $X<Y$ if and only if $X-t < Y-t$. This shows that the conditional probability is indeed $P(D)$, i.e. $$P(X<Y \mid X\geq t,\ Y\geq t)=P(D).$$ By independence of $X$ and $Y$, we have $P(X\geq t,\ Y\geq t) = P(X\geq t)\,P(Y\geq t)$. Combining everything, we obtain $$P(\{T\geq t\} \cap D)=P(X\geq t)\,P(Y\geq t)\,P(D)= e^{-(\lambda+\mu) t}\, \frac{\lambda}{\lambda+\mu}.$$ Dividing by $P(D)$ gives $P(T\geq t \mid D)= e^{-(\lambda+\mu) t}$, so $T\mid D \sim \mathrm{Exp}(\lambda+\mu)$ and therefore $$E[T\mid D]=\frac{1}{\lambda+\mu}.$$
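The factorization $P(\{T\geq t\} \cap D) = e^{-(\lambda+\mu)t}\,\lambda/(\lambda+\mu)$, i.e. the independence of $T$ and $D$, can also be checked by simulation. A minimal sketch, with arbitrary choices of the rates and of the threshold $t$:

```python
import math
import random

# Monte Carlo check that P(T >= t, D) = e^{-(lam+mu)*t} * lam/(lam+mu),
# i.e. that T = min(X, Y) and D = {X < Y} are independent.
# Rates lam, mu and the threshold t0 are arbitrary choices.
random.seed(1)
lam, mu, t0 = 2.0, 3.0, 0.3
n = 500_000
hits = 0
for _ in range(n):
    x = random.expovariate(lam)   # X ~ Exp(lam)
    y = random.expovariate(mu)    # Y ~ Exp(mu)
    if min(x, y) >= t0 and x < y: # event {T >= t0} intersected with D
        hits += 1

est = hits / n
exact = math.exp(-(lam + mu) * t0) * lam / (lam + mu)
print(est, exact)   # empirical frequency vs. closed form
```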