Suppose I have two cumulative distribution functions, $P(x)$ and $P(y)$. How do I combine these two distributions to find the probability that one random variable is lower than the other? That is, how do I find $P(x < y)$?
I'm familiar enough with statistics to calculate this for discrete random variables like dice throws, by going through each possible outcome and summing the probabilities. I can do this because the number of outcomes is finite, but I can't apply this method to a continuous distribution. What's the proper mathematical way of doing this?
Let $Z=X-Y$.
$P(X-Y < 0) = P(X-Y\le 0) - P(X -Y = 0) = F_Z(0) - P(X -Y = 0) = P(Z \le 0) - P(X -Y = 0)$
For continuous distributions $P(X -Y = 0) = 0$ (Why?)
To compute $P(Z = X - Y < 0)$, we have a double integral:
$$\iint_{A} f_{X,Y}(x,y)\,dx\,dy$$
So what is $A$, the region of integration (represented by the bounds of integration)?
Edit: the region of integration (integrating $dx$ then $dy$) is
$$\int_{-\infty}^{\infty} \int_{-\infty}^{y} f_{X,Y}(x,y)\,dx\,dy$$
The reader is invited to review calculus to determine why that is so, and, as an exercise, to find the bounds if the order is $dy\,dx$ instead.
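As a numerical sanity check of those bounds (a sketch assuming SciPy is available, using independent normals $X \sim N(0,1)$ and $Y \sim N(1,1)$ as an illustration, since for normals the closed form $P(X<Y) = \Phi\big((\mu_Y-\mu_X)/\sqrt{\sigma_X^2+\sigma_Y^2}\big)$ is known):

```python
import numpy as np
from scipy.integrate import dblquad
from scipy.stats import norm

# Joint density f_{X,Y}(x, y) = p_X(x) * p_Y(y) by independence.
def joint(x, y):
    return norm.pdf(x, loc=0, scale=1) * norm.pdf(y, loc=1, scale=1)

# dblquad integrates the first argument of joint (here x) on the inner
# integral: x runs over (-inf, y) while the outer y runs over (-inf, inf),
# matching the bounds above.
p, _ = dblquad(joint, -np.inf, np.inf, -np.inf, lambda y: y)

# Closed form for independent normals: Phi((mu_Y - mu_X) / sqrt(s_X^2 + s_Y^2))
exact = norm.cdf((1 - 0) / np.sqrt(1 + 1))
print(p, exact)  # both are about 0.7602
```

The agreement between the numerical integral and the closed form confirms the region $\{(x, y) : x < y\}$ was set up correctly.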
Draw the $xy$-plane and the line $x = y$. Shade the region corresponding to $x < y$. What are the bounds of integration that represent this double integral?
Edit: I assume the existence of $f_{X,Y}$, the joint density, which equals $p_X \, p_Y$, the product of the pdfs of $X$ and $Y$,$^{[1]}$ assuming $X$ and $Y$ are independent and each has a pdf.
$^{[1]}$ $p_X(x) = P'(x)$, $p_Y(y) = P'(y)$
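When the double integral has no convenient closed form, the same probability can be approximated by simulation. A minimal Monte Carlo sketch (assuming NumPy, again using independent $X \sim N(0,1)$ and $Y \sim N(1,1)$ purely as an example):

```python
import numpy as np

# Estimate P(X < Y) as the fraction of sampled pairs with x < y.
rng = np.random.default_rng(seed=0)
n = 1_000_000
x = rng.normal(0.0, 1.0, size=n)
y = rng.normal(1.0, 1.0, size=n)

p_hat = np.mean(x < y)
print(p_hat)  # close to the exact value Phi(1/sqrt(2)) ~ 0.7602
```

The estimator's standard error is roughly $\sqrt{p(1-p)/n}$, so with a million draws the estimate is accurate to about three decimal places.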