Let $X$ and $Y$ be independent random variables such that $X \sim \mathrm{Exp}(\lambda_1)$ and $Y \sim \mathrm{Exp}(\lambda_2)$. Let $Z=\min(X,Y)$. Show that $\mathbb P(Z = X) = \frac{\lambda_1}{\lambda_1+\lambda_2}$.
I of course wrote that $\mathbb P(Z = X)=\Bbb P(X \leq Y)$, but I don't know how to continue. The solution I was given says that since $X$ and $Y$ are independent, the joint density of $(X,Y)$ is given by $$f_{X,Y}(x, y) = f_X(x)f_Y(y) = \lambda_1 e^{-\lambda_1 x}\,\lambda_2 e^{-\lambda_2 y}$$
such that $$P(X \leq Y) = \int_0^\infty \int_x^\infty \lambda_1 \lambda_2\, e^{-\lambda_1 x} e^{-\lambda_2 y}\,dy\,dx$$
I really do not understand why this is taken to $\Bbb R^2$ and, most importantly, why is the last equality true?
I read the post Finding P(X&lt;Y) for exponential Random Variable, which I think relies on the post Proving the identity $P( X + Y = a)= \int_{-\infty}^{\infty} P( X + y = a)f_Y(y) \, \text{d}y $, which in turn uses the "Law of Iterated Expectation" (which I don't know).
Is there a way out that does not require this law ?
Why $\mathbb{R}^2$: because there are two random variables, and you are integrating the joint density of $(X,Y)$ over a region of the plane.
How to continue: just carry out the integral as written. I don't think you need the "Law of Iterated Expectation" (I don't know it either), but I can do the integration.
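If you want to double-check the double integral before doing it by hand, it can be evaluated symbolically. A minimal sketch with sympy (the symbol names are just illustrative):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
l1, l2 = sp.symbols('lambda_1 lambda_2', positive=True)

# Joint density of (X, Y), using independence
f = l1 * sp.exp(-l1 * x) * l2 * sp.exp(-l2 * y)

# P(X <= Y): inner integral over y from x to infinity,
# outer integral over x from 0 to infinity
p = sp.integrate(sp.integrate(f, (y, x, sp.oo)), (x, 0, sp.oo))
print(sp.simplify(p))  # lambda_1/(lambda_1 + lambda_2)
```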
It goes like this:
\begin{align} P(X \leq Y ) &= \int_0^\infty \int_x^\infty \lambda_1 \lambda_2\, e^{-\lambda_1 x} e^{-\lambda_2 y}\,dy\,dx\\ &= \int_0^\infty \lambda_1 e^{-\lambda_1 x}\left(\int_x^\infty \lambda_2 e^{-\lambda_2 y}\,dy\right)dx\\ &= \int_0^\infty \lambda_1 e^{-\lambda_1 x} e^{-\lambda_2 x}\,dx\\ &= \int_0^\infty \lambda_1 e^{-(\lambda_1+\lambda_2)x}\,dx=\frac{\lambda_1}{\lambda_1+\lambda_2}. \end{align}
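As a sanity check on the result, here is a quick Monte Carlo sketch (the rate values and the helper's name are chosen arbitrarily for illustration):

```python
import random

def estimate_p_x_le_y(lam1, lam2, n=200_000, seed=0):
    """Monte Carlo estimate of P(X <= Y) for X ~ Exp(lam1), Y ~ Exp(lam2)."""
    rng = random.Random(seed)
    hits = sum(rng.expovariate(lam1) <= rng.expovariate(lam2) for _ in range(n))
    return hits / n

lam1, lam2 = 2.0, 3.0
est = estimate_p_x_le_y(lam1, lam2)
exact = lam1 / (lam1 + lam2)
print(est, exact)  # est should be close to exact = 0.4
```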