Sum of independent random variables following a Homogeneous Poisson Process


Messages arrive at a computer from two telephone lines according to independent Poisson processes with mean inter-arrival times of 20 milliseconds (Line 1) and 30 milliseconds (Line 2), respectively.

Q.1 What is the expected amount of time until you have received a message from each of the two telephone lines?

Solution: No idea. Please help.

Q.2 What is the probability that the first message came from telephone line 1? From line 2?

Solution I was thinking of: Should I use λ(Line 1)/(λ(Line 1) + λ(Line 2))? Also, is λ(Line 1) = 1/20 per millisecond?

Q.3 What is the probability that no messages arrive in the first 100 milliseconds?

Solution I was thinking of: Mean of Line 1 + Mean of Line 2 = 50 milliseconds, so the combined rate λ = 1/50 per millisecond. In 100 milliseconds I should then get 2 messages, i.e. λ = 2, so P(0 messages in 100 milliseconds) = (e^(-2) · 2^0)/0! ≈ 0.135.

Q.4 If the first message arrived from Line 2, what is the expected time for the first message to come from Line 1?

Solution: Need help here. I thought about conditional expectation.

Edit: I am not asking for the answers or expecting you to solve these for me. I just need to know how to approach these questions, or be pointed toward a possible way of solving them. Thank you for your time; I appreciate your efforts.


Let $N(t)$ and $M(t)$ be independent Poisson processes with rates $\lambda$ and $\mu$, respectively. Let $\{T_n\}$ be the jump times of $N(t)$ and $\{S_n\}$ the jump times of $M(t)$, with $T_0=S_0=0$. Then the increments of the jump times $T_j-T_{j-1}$ and $S_j-S_{j-1}$, also known as the holding times, are exponentially distributed with rates $\lambda$ and $\mu$, respectively.

Note that for $t\geqslant 0$,
$$ \{T_1\vee S_1\leqslant t\} = \{T_1\leqslant t, S_1\leqslant t \}, $$
and hence
\begin{align} \mathbb P(T_1\vee S_1\leqslant t) &= \mathbb P(T_1\leqslant t, S_1\leqslant t)\\ &= \mathbb P(T_1\leqslant t)\,\mathbb P(S_1\leqslant t)\\ &= (1-e^{-\lambda t})(1-e^{-\mu t})\\ &= 1 - (e^{-\lambda t} + e^{-\mu t} - e^{-(\lambda+\mu)t}). \end{align}
Since $T_1\vee S_1$ is nonnegative almost surely, we may compute its expectation by integrating its survival function over $(0,\infty)$:
$$ \mathbb E[T_1\vee S_1] = \int_0^\infty \left(e^{-\lambda t} + e^{-\mu t} - e^{-(\lambda+\mu)t}\right) \mathsf dt = \frac1\lambda + \frac1\mu - \frac1{\lambda+\mu} = \frac{(\lambda+\mu)^2-\lambda\mu}{\lambda\mu(\lambda+\mu)}. $$
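As a sanity check, here is a short Monte Carlo sketch (in Python) of the identity $\mathbb E[T_1\vee S_1] = \frac1\lambda + \frac1\mu - \frac1{\lambda+\mu}$, using the rates from the question, $\lambda = 1/20$ and $\mu = 1/30$ per millisecond:

```python
import random

# Monte Carlo check of E[T1 ∨ S1] = 1/λ + 1/μ - 1/(λ+μ).
# Rates taken from the question: λ = 1/20, μ = 1/30 (per millisecond).
lam, mu = 1 / 20, 1 / 30

random.seed(0)
n = 200_000
total = 0.0
for _ in range(n):
    t1 = random.expovariate(lam)  # first jump time of N(t), Exp(λ)
    s1 = random.expovariate(mu)   # first jump time of M(t), Exp(μ)
    total += max(t1, s1)          # time until a message from EACH line
estimate = total / n

exact = 1 / lam + 1 / mu - 1 / (lam + mu)  # = 20 + 30 - 12 = 38 ms
print(f"simulated: {estimate:.2f} ms, exact: {exact:.2f} ms")
```

With these rates the exact answer is 20 + 30 − 12 = 38 ms, and the simulated mean should land close to it.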

The probability that the first arrival came from $N(t)$ is simply $\frac\lambda{\lambda+\mu}$, and by symmetry the probability that the first arrival came from $M(t)$ is $\frac\mu{\lambda+\mu}$.
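The same simulation idea verifies this splitting probability; with the question's rates, $\frac{\lambda}{\lambda+\mu} = \frac{1/20}{1/20 + 1/30} = \frac35$:

```python
import random

# Estimate P(first message from line 1) = P(T1 < S1) = λ/(λ+μ).
# With λ = 1/20 and μ = 1/30 (per ms, from the question) this is 3/5.
lam, mu = 1 / 20, 1 / 30

random.seed(1)
n = 200_000
wins = sum(random.expovariate(lam) < random.expovariate(mu) for _ in range(n))
frac = wins / n
print(f"simulated: {frac:.3f}, exact: {lam / (lam + mu):.3f}")
```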

For a fixed $T>0$, we compute the probability that no messages arrive in $[0,T]$ by considering the superimposed process $L(t)=N(t)+M(t)$ which is a Poisson process with rate $\lambda+\mu$ (it is a good exercise to prove this fact). So $$ \mathbb P(L(T)=0) = \mathbb P(T_1\wedge S_1>T) = e^{-(\lambda+\mu)T}. $$
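A quick numerical check of this, with the question's rates and $T = 100$ ms, where $e^{-(\lambda+\mu)T} = e^{-25/3} \approx 2.4\times 10^{-4}$:

```python
import math
import random

# P(no messages in [0, T]) = P(T1 ∧ S1 > T) = e^{-(λ+μ)T}.
# With λ = 1/20, μ = 1/30 (per ms) and T = 100 ms this is e^{-25/3} ≈ 2.4e-4.
lam, mu, T = 1 / 20, 1 / 30, 100.0
exact = math.exp(-(lam + mu) * T)

# Monte Carlo: no message by time T iff both first arrival times exceed T.
random.seed(2)
n = 1_000_000
hits = sum(min(random.expovariate(lam), random.expovariate(mu)) > T
           for _ in range(n))
print(f"simulated: {hits / n:.2e}, exact: {exact:.2e}")
```

Note how different this is from the asker's attempt of adding the mean inter-arrival times: rates add under superposition, means do not.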

Conditioned on $\{S_1<T_1\}$ (i.e., the first message came from Line 2), the expectation of $T_1$ is \begin{align} \mathbb E[T_1\mid S_1<T_1] &= \frac{\mathbb E[T_1\mathsf 1_{S_1<T_1}]}{\mathbb P(S_1<T_1)}\\ &= \frac{\int_0^\infty \int_0^t t\, \lambda e^{-\lambda t}\, \mu e^{-\mu s}\ \mathsf ds \ \mathsf dt}{\frac\mu{\lambda+\mu}}\\ &= \frac{\lambda+\mu}{\mu}\left(\frac1\lambda - \frac{\lambda}{(\lambda+\mu)^2}\right)\\ &=\frac{2\lambda+\mu}{\lambda(\lambda+\mu)} = \frac1{\lambda+\mu} + \frac1\lambda. \end{align} The last form also follows from memorylessness: the first arrival of the superposed process occurs at expected time $\frac1{\lambda+\mu}$, and given that it came from $M(t)$, Line 1 still needs an expected additional $\frac1\lambda$.
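This conditional expectation can also be checked by simulation. By memorylessness it equals $\frac1{\lambda+\mu} + \frac1\lambda$, which with the question's rates is $12 + 20 = 32$ ms:

```python
import random

# Check E[T1 | S1 < T1] = 1/(λ+μ) + 1/λ: the first message arrives after
# an expected 1/(λ+μ); if it came from line 2, memorylessness says line 1
# still needs an expected 1/λ more. With λ = 1/20, μ = 1/30: 12 + 20 = 32 ms.
lam, mu = 1 / 20, 1 / 30

random.seed(3)
n = 500_000
total, count = 0.0, 0
for _ in range(n):
    t1 = random.expovariate(lam)
    s1 = random.expovariate(mu)
    if s1 < t1:              # first message came from line 2
        total += t1          # accumulate line 1's first arrival time
        count += 1
cond_mean = total / count
print(f"simulated: {cond_mean:.2f} ms, exact: {1/(lam+mu) + 1/lam:.2f} ms")
```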