I'm trying to follow a line in a derivation for $P(Z>X+Y)$ where $X,Y,Z$ are independent continuous random variables distributed uniformly on $(0,1)$.
I've already derived the pdf of $X+Y$ using the convolution theorem, but there's a line in the answer that says:
$P(Z>X+Y) = \mathbb{E}[\,P(Z>X+Y \mid X+Y)\,]$, where $\mathbb{E}$ is the expectation.
I'm not familiar with this result. Could anyone give a pointer to a similar result if one exists?
Thanks.
$$\mathbb{P}(Z>X+Y)=\mathbb{E}[\mathbb{1}(Z>X+Y)]=\mathbb{E}[\mathbb{E}[\mathbb{1}(Z>X+Y)\mid X+Y]]=\mathbb{E}[\mathbb{P}(Z>X+Y\mid X+Y)],$$ where the second equality is the tower property (law of total expectation) of conditional expectation: $$\mathbb{E}[\mathbb{E}[X\mid Y]]=\mathbb{E}[X].$$ Intuitively, now that you know the distribution of $X+Y$, you just need to "range"$^1$ through the values of $X+Y$ and find the probability of $Z>X+Y$ for each such value. This is exactly the expectation of the conditional probability.
$^1$integrate against the density, i.e. $\int_0^2\mathbb{P}(Z>v)f_{X+Y}(v)\;dv$
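As a sanity check (not part of the original answer), here is a quick Monte Carlo simulation of $\mathbb{P}(Z>X+Y)$. On $(0,1)$ the density of $X+Y$ is $f_{X+Y}(v)=v$ and $\mathbb{P}(Z>v)=1-v$ (it is $0$ for $v\ge 1$), so the footnote's integral reduces to $\int_0^1 (1-v)\,v\,dv = \tfrac12-\tfrac13=\tfrac16$, which the simulation should roughly reproduce:

```python
import random

random.seed(0)
n = 200_000

# Estimate P(Z > X + Y) by sampling X, Y, Z independently from U(0, 1).
hits = sum(random.random() > random.random() + random.random()
           for _ in range(n))
mc_estimate = hits / n

# Exact value from the integral: ∫_0^1 (1 - v) v dv = 1/2 - 1/3 = 1/6.
exact = 1 / 6

print(mc_estimate, exact)  # the estimate should be close to 0.1667
```

With $n=200{,}000$ samples the standard error is below $0.001$, so the estimate lands very close to $1/6$.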