Poisson process counting process


Two individuals, A and B, both require kidney transplants. If she does not receive a new kidney, then A will die after an exponential time with rate $\mu_A$, and B after an exponential time with rate $\mu_B$. New kidneys arrive in accordance with a Poisson process having rate $\lambda$. It has been decided that the first kidney will go to A (or to B if B is alive and A is not at that time) and the next one to B (if still living).

(a) What is the probability that A obtains a new kidney? (b) What is the probability that B obtains a new kidney?

The answer for (a) should just be the probability that the time $T_1$ until the first event (a kidney arrival) is less than the lifetime of $A$. Since both are exponentially distributed, we have $P\{T_1 < A_{\text{lifetime}}\} = \frac{\lambda}{\lambda + \mu_A}$.
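This competing-exponentials claim is easy to sanity-check with a quick Monte Carlo sketch (the rate values $\lambda = 2$, $\mu_A = 1$ are arbitrary illustrations, not from the problem):

```python
import random

# Sanity check for (a): P(first kidney arrives before A dies)
# should equal lam / (lam + mu_A) for independent exponentials.
# The rates below are arbitrary illustration values.
random.seed(0)
lam, mu_A = 2.0, 1.0
n = 200_000

hits = sum(
    random.expovariate(lam) < random.expovariate(mu_A)  # arrival beats A's death
    for _ in range(n)
)
estimate = hits / n
exact = lam / (lam + mu_A)
print(f"simulated: {estimate:.4f}  exact: {exact:.4f}")
```

With these rates the exact value is $2/3$, and the empirical frequency should agree to about two decimal places.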

But what about (b), the probability that B obtains a new kidney? Can someone help me?


There are 2 best solutions below


What may help you is knowing that the waiting time between Poisson events has an exponential distribution with rate $\lambda$ (and hence mean $1/\lambda$).
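One way to see this without assuming it: build a Poisson process on $[0, T]$ from its defining properties (a Poisson-distributed count whose points are i.i.d. uniform given the count) and measure the gaps. A sketch, with an arbitrary rate and a normal approximation to the count (both my choices, not from the problem):

```python
import random, math

# Build a Poisson process on [0, T] from its definition: a Poisson(lam*T)
# count, with arrival times i.i.d. uniform on [0, T] given the count.
# The inter-arrival gaps should then have mean ~ 1/lam.
random.seed(3)
lam, T = 2.0, 10_000.0

# A Poisson(lam*T) count for large lam*T is well approximated by a
# normal draw rounded to an integer (CLT) -- a simplification here.
n = max(0, round(random.gauss(lam * T, math.sqrt(lam * T))))

arrivals = sorted(random.uniform(0, T) for _ in range(n))
gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
mean_gap = sum(gaps) / len(gaps)
print(f"mean gap: {mean_gap:.4f}  expected 1/lam = {1/lam:.4f}")
```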

EDIT

B can get a kidney under two circumstances: 1) A dies before the first kidney arrives, while B is still alive to receive it; 2) the first kidney arrives while A is alive (so it goes to A), and a second kidney arrives before B dies. For the first route, A's death must be the first of the three competing exponential events, which happens with probability $\frac{\mu_A}{\lambda+\mu_A+\mu_B}$, and by memorylessness B then survives until the next arrival with probability $\frac{\lambda}{\lambda+\mu_B}$. For the second route, the first arrival beats both deaths with probability $\frac{\lambda}{\lambda+\mu_A+\mu_B}$, and again B survives until the next arrival with probability $\frac{\lambda}{\lambda+\mu_B}$; the waiting time to the second kidney is a sum of two independent exponentials, but memorylessness lets you restart the clock at the first arrival. Adding the two routes gives $\frac{\lambda+\mu_A}{\lambda+\mu_A+\mu_B}\cdot\frac{\lambda}{\lambda+\mu_B}$.
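A Monte Carlo sketch of the two routes; the rate values and the closed form used for comparison, $\lambda(\lambda+\mu_A)/\big((\lambda+\mu_B)(\lambda+\mu_A+\mu_B)\big)$, are my additions (the latter from the memorylessness argument), not from the original post:

```python
import random

# Monte Carlo sketch for (b): B gets a kidney either because A dies before
# the first kidney arrives (and B is still alive at that arrival), or
# because the first kidney goes to A and a second one arrives before B dies.
# Rates lam, mu_A, mu_B are arbitrary illustration values.
random.seed(1)
lam, mu_A, mu_B = 1.0, 0.5, 0.7
n = 400_000

hits = 0
for _ in range(n):
    t_a = random.expovariate(mu_A)       # A's lifetime
    t_b = random.expovariate(mu_B)       # B's lifetime
    t1 = random.expovariate(lam)         # first kidney arrival
    t2 = t1 + random.expovariate(lam)    # second kidney arrival
    if (t_a < t1 < t_b) or (t1 < t_a and t2 < t_b):
        hits += 1

estimate = hits / n
exact = lam * (lam + mu_A) / ((lam + mu_B) * (lam + mu_A + mu_B))
print(f"simulated: {estimate:.4f}  exact: {exact:.4f}")
```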


Let me say up front that this is not a pretty solution.

Suppose that the death times of $A$ and $B$ are $T_A$ and $T_B$, and let $T_1, T_2$ be the first two arrival times of the Poisson process, with $N(t)$ its counting process. The probability that $B$ gets a kidney is: $$ \Pr\big[\left(T_1<T_A \text{ and } T_2<T_B \right)\text{ or } \left(T_1>T_A \text{ and } T_1<T_B \right)\big]=\\ \Pr\left(T_1<T_A \text{ and } T_2<T_B \right)+\Pr\left(T_1>T_A \text{ and } T_1<T_B \right),\\ $$ since the two events are disjoint. Condition on $T_A=x, T_B=y$. For the first probability, note that $\{T_1<x \text{ and } T_2<y\}=\{N(x)\ge 1 \text{ and } N(y)\ge 2\}$. If $y\le x$, then $T_2<y$ already forces $T_1<x$, so the conditional probability is $\Pr(N(y)\ge 2)=1-e^{-\lambda y}-\lambda ye^{-\lambda y}$. If $y>x$, the events $\{N(x)\ge 1\}$ and $\{N(y)\ge 2\}$ are not independent, so their probabilities cannot simply be multiplied; instead, subtract the paths with no arrival before $x$, using independent increments: $$ \Pr(N(y)\ge 2)-\Pr(N(x)=0)\Pr\big(N(y)-N(x)\ge 2\big)\\ =\left(1-e^{-\lambda y}-\lambda ye^{-\lambda y}\right)-e^{-\lambda x}\left(1-e^{-\lambda(y-x)}-\lambda(y-x)e^{-\lambda(y-x)}\right)=1-e^{-\lambda x}-\lambda xe^{-\lambda y}. $$ For the second probability, $\{T_1>x \text{ and } T_1<y\}=\{N(x)=0 \text{ and } N(y)\ge 1\}$, which is empty unless $y>x$, in which case: $$ \Pr(N(x)=0)\Big(1-\Pr\big(N(y)-N(x)=0\big)\Big)=e^{-\lambda x}\left(1-e^{-\lambda(y-x)}\right)=e^{-\lambda x}-e^{-\lambda y}. $$ Therefore the sum of the conditional probabilities is, for $y>x$, $$ \left(1-e^{-\lambda x}-\lambda xe^{-\lambda y}\right)+\left(e^{-\lambda x}-e^{-\lambda y}\right)=1-(1+\lambda x)e^{-\lambda y}, $$ and for $y\le x$ it is $1-(1+\lambda y)e^{-\lambda y}$; compactly, $1-\big(1+\lambda\min(x,y)\big)e^{-\lambda y}$. Now it is enough to find the following expectation: $$ \Pr(B \text{ gets a kidney})=1-\mathbb E\left[e^{-\lambda T_B}\right]-\lambda\,\mathbb E\left[\min(T_A,T_B)\,e^{-\lambda T_B}\right]. $$ First of all, we have $\mathbb E\left[e^{-\lambda T_B}\right]=\frac{\mu_B}{\lambda+\mu_B}$. For the second expectation, write $\min(x,y)=\int_0^\infty \mathbf 1(t<x)\mathbf 1(t<y)\,dt$ and use the independence of $T_A$ and $T_B$: $$ \mathbb E\left[\min(T_A,T_B)\,e^{-\lambda T_B}\right]=\int_0^\infty e^{-\mu_A t}\,\mathbb E\left[\mathbf 1(T_B>t)\,e^{-\lambda T_B}\right]dt=\int_0^\infty e^{-\mu_A t}\cdot\frac{\mu_B}{\lambda+\mu_B}e^{-(\lambda+\mu_B)t}\,dt=\frac{\mu_B}{(\lambda+\mu_B)(\lambda+\mu_A+\mu_B)}. $$ Therefore: $$ \Pr(B \text{ gets a kidney})=1-\frac{\mu_B}{\lambda+\mu_B}-\frac{\lambda\mu_B}{(\lambda+\mu_B)(\lambda+\mu_A+\mu_B)}=\frac{\lambda}{\lambda+\mu_B}\cdot\frac{\lambda+\mu_A}{\lambda+\mu_A+\mu_B}, $$ which agrees with the memorylessness argument: either $A$'s death or a kidney arrival beats $B$'s death with probability $\frac{\lambda+\mu_A}{\lambda+\mu_A+\mu_B}$, and $B$ then survives until the next arrival with probability $\frac{\lambda}{\lambda+\mu_B}$.
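As an independent cross-check (my addition, not part of the answer above), one can numerically integrate conditional probabilities computed directly from the Poisson-count formulas against the joint density of $(T_A, T_B)$, and compare with the closed form from the memorylessness argument. The rate values are arbitrary illustrations:

```python
import math

# Numerical cross-check of the conditioning argument. The conditional
# probabilities come straight from the Poisson-count formulas:
#   P(T1 < x, T2 < y) = 1 - e^{-lam*y} - lam*y*e^{-lam*y}   if y <= x
#                     = 1 - e^{-lam*x} - lam*x*e^{-lam*y}   if y > x
#   P(x < T1 < y)     = e^{-lam*x} - e^{-lam*y}             if y > x, else 0
# Rates lam, mu_A, mu_B are arbitrary illustration values.
lam, mu_A, mu_B = 1.0, 0.5, 0.7

def cond_prob(x, y):
    """P(B gets a kidney | T_A = x, T_B = y)."""
    if y <= x:
        first = 1 - math.exp(-lam*y) - lam*y*math.exp(-lam*y)
        second = 0.0
    else:
        first = 1 - math.exp(-lam*x) - lam*x*math.exp(-lam*y)
        second = math.exp(-lam*x) - math.exp(-lam*y)
    return first + second

# Midpoint-rule double integral against the joint density of (T_A, T_B).
h, upper = 0.05, 40.0
steps = int(upper / h)
p = 0.0
for i in range(steps):
    x = (i + 0.5) * h
    fx = mu_A * math.exp(-mu_A * x)
    for j in range(steps):
        y = (j + 0.5) * h
        p += cond_prob(x, y) * fx * mu_B * math.exp(-mu_B * y) * h * h

closed_form = lam * (lam + mu_A) / ((lam + mu_B) * (lam + mu_A + mu_B))
print(f"numeric: {p:.4f}  closed form: {closed_form:.4f}")
```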