Absolute probability via Conditional probability


I have recently seen the following equality:

For two uniformly (on $[0,1]$) distributed random variables $X_1$ and $X_2$, possibly dependent, \begin{align*} P(X_1\leq a, X_2\leq b) = \int_0^{b} F_s(P(X_1\leq a|X_2=s)) ds \end{align*} with $F_s$ being the conditional distribution of $X_1$ given $X_2 = s$.

I guess it should be trivial as it was just mentioned as a comment, but I don't see why it holds.


More correct formulation: $$ P(X_1\leqslant a, X_2\leqslant b) = \int_0^{b} F_s(a)\mathrm ds,\qquad F_s(a)=P(X_1\leqslant a\mid X_2=s). $$ The Lebesgue measure $\mathrm ds$ on $[0,b]$ reflects the fact that $X_2$ is uniformly distributed on $[0,1]$. The distribution of $X_1$ is irrelevant.
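As a quick sanity check (not from the original answer): in the independent case the formula reduces to the familiar product rule, since $F_s(a)=P(X_1\leqslant a)$ then does not depend on $s$:
$$ \int_0^{b} F_s(a)\,\mathrm ds = P(X_1\leqslant a)\int_0^b \mathrm ds = P(X_1\leqslant a)\,b = P(X_1\leqslant a)\,P(X_2\leqslant b), $$
using $P(X_2\leqslant b)=b$ because $X_2$ is uniform on $[0,1]$.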

To see why this holds, note that, more generally, by definition of the conditional distribution, for every (integrable measurable) function $u$, $$ E(u(X_1,X_2))=\int_0^1\int_0^1u(r,s)\mathrm dF_s(r)\mathrm ds. $$ Apply this to $u_{a,b}=\mathbf 1_{[0,a]\times[0,b]}$ and use the fact that $$ \int_0^1u_{a,b}(r,s)\mathrm dF_s(r)=F_s(a)\mathbf 1_{s\in[0,b]}. $$
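The formula can also be checked numerically in a fully dependent case. Below is a minimal Monte Carlo sketch (my own illustration, not part of the answer) with $X_1 = X_2$, for which $F_s(a)=P(X_1\leqslant a\mid X_2=s)=\mathbf 1_{s\leqslant a}$ and the integral $\int_0^b F_s(a)\,\mathrm ds$ evaluates to $\min(a,b)$:

```python
import random

def lhs(a, b, n=200_000, seed=0):
    """Monte Carlo estimate of P(X1 <= a, X2 <= b) when X1 = X2."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        s = rng.random()   # X2 ~ Uniform[0, 1]
        x1 = s             # extreme dependence: X1 = X2
        hits += (x1 <= a) and (s <= b)
    return hits / n

def rhs(a, b):
    """Integral side: here F_s(a) = 1[s <= a], so
    the integral of F_s(a) ds over [0, b] is min(a, b)."""
    return min(a, b)

print(lhs(0.3, 0.7), rhs(0.3, 0.7))  # should agree up to Monte Carlo error
```

The same check works for any other coupling of two uniforms; only `F_s` in `rhs` changes, never the $\mathrm ds$ measure, which is fixed by $X_2$ being uniform.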