Let $X_1\sim\operatorname{Exp}(\lambda_1)$ and $X_2\sim\operatorname{Exp}(\lambda_2)$ be independent; show $P(X_1<X_2)=\frac{\lambda_1}{\lambda_1+\lambda_2}$


Let $X_1\sim\operatorname{Exp}(\lambda_1)$ and $X_2\sim\operatorname{Exp}(\lambda_2)$ be independent random variables.

Show that $P(X_1<X_2)=\dfrac{\lambda_1}{\lambda_1+\lambda_2}$.

Accepted answer:

Hint: just calculate, using independence to factor the joint density,

$$P(X_1<X_2)=\int_0^{\infty} \int_0^{x_2} \lambda_1\cdot e^{-\lambda_1\cdot x_1}\cdot \lambda_2\cdot e^{-\lambda_2\cdot x_2} dx_1 dx_2$$

$$=\lambda_1\cdot \lambda_2\cdot\int_0^{\infty} e^{-\lambda_2\cdot x_2}\cdot \left( \int_0^{x_2} e^{-\lambda_1\cdot x_1} dx_1 \right) dx_2.$$

The inner integral equals $\frac{1}{\lambda_1}\left(1-e^{-\lambda_1 x_2}\right)$, so

$$P(X_1<X_2)=\lambda_2\int_0^{\infty}\left(e^{-\lambda_2 x_2}-e^{-(\lambda_1+\lambda_2) x_2}\right)dx_2=1-\frac{\lambda_2}{\lambda_1+\lambda_2}=\frac{\lambda_1}{\lambda_1+\lambda_2}.$$
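The closed form is easy to check numerically. Below is a minimal Monte Carlo sketch (not part of the proof; the function name and sample size are my own choices) that estimates $P(X_1<X_2)$ by simulation and compares it to $\lambda_1/(\lambda_1+\lambda_2)$:

```python
import random

def prob_x1_less_x2(lam1, lam2, n=200_000, seed=0):
    """Estimate P(X1 < X2) for independent exponentials by simulation."""
    rng = random.Random(seed)
    # expovariate(lam) samples an Exp(lam) random variable
    hits = sum(rng.expovariate(lam1) < rng.expovariate(lam2) for _ in range(n))
    return hits / n

if __name__ == "__main__":
    lam1, lam2 = 2.0, 3.0
    print(prob_x1_less_x2(lam1, lam2))   # estimate, close to 2/5
    print(lam1 / (lam1 + lam2))          # exact value: 0.4
```

With $n=200{,}000$ samples the estimate typically lands within a few thousandths of the exact value $\lambda_1/(\lambda_1+\lambda_2)=0.4$.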