Proof for identity involving joint probability and conditional probability.


How do you prove the following identity? $$\mathbb{P}(X \in A, Y \in B) = \int_B \mathbb{P}(X \in A \mid Y = y)\,\mathbb{P}_Y(dy)$$ Additionally, what assumptions on $X$, $Y$, $A$ and $B$ are needed?


Assume, for instance, that $(X,Y)$ has a joint probability density $f$, and let $f_Y$ be the marginal density of $Y$. Suppose moreover that $f_Y(y) > 0$ for all $y \in B$. In this case, for any measurable function $\psi$ such that $\psi(X)$ is integrable, conditioning on $\{Y = y\}$ is defined by
\begin{align}
\mathbb{E}\left(\psi(X) \mid Y = y\right) &:= \int \psi(x)\frac{f(x,y)}{f_Y(y)}\,dx \\
&:= \int \psi(x)\,f_{X|Y=y}(x)\,dx.
\end{align}
In particular, $\mathbb{E}\left(\psi(X) \mid Y = y\right)$ is a function of $y$.

Now observe that, by the tower property and the fact that $1_{Y \in B}$ is $\sigma(Y)$-measurable,
\begin{align}
\mathbb{P}(X \in A, Y \in B) &= \mathbb{E}(1_{X \in A}1_{Y \in B}) \\
&= \mathbb{E}\big(\mathbb{E}(1_{X \in A}1_{Y \in B} \mid Y)\big) \\
&= \mathbb{E}\big(1_{Y \in B}\,\mathbb{E}(1_{X \in A} \mid Y)\big) \\
&=: \mathbb{E}\big(1_{Y \in B}\,\phi(Y)\big),
\end{align}
where $\phi : y \mapsto \phi(y) = \mathbb{E}(1_{X \in A} \mid Y = y)$. It follows that
\begin{align}
\mathbb{P}(X \in A, Y \in B) &= \mathbb{E}\big(1_{Y \in B}\,\phi(Y)\big) \\
&= \int_B \phi(y)\,\mathbb{P}_Y(dy) \\
&= \int_B \mathbb{E}(1_{X \in A} \mid Y = y)\,\mathbb{P}_Y(dy) \\
&= \int_B \mathbb{P}(X \in A \mid Y = y)\,\mathbb{P}_Y(dy).
\end{align}
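As a sanity check on the identity, here is a small numerical sketch (not part of the proof, and the distribution is my own choice): take $(X,Y)$ bivariate normal with correlation $\rho$, $A = [0,\infty)$ and $B = [0,1]$. Then $X \mid Y = y \sim \mathcal{N}(\rho y,\, 1-\rho^2)$, so the right-hand side can be computed by numerical integration, while the left-hand side is estimated by Monte Carlo.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

rho = 0.6                      # correlation of the bivariate normal (example choice)
a_lo, b_lo, b_hi = 0.0, 0.0, 1.0   # A = [0, inf), B = [0, 1]

# Left side: Monte Carlo estimate of P(X in A, Y in B).
rng = np.random.default_rng(0)
n = 1_000_000
y = rng.standard_normal(n)
x = rho * y + np.sqrt(1 - rho**2) * rng.standard_normal(n)
lhs = np.mean((x >= a_lo) & (y >= b_lo) & (y <= b_hi))

# Right side: integral over B of P(X in A | Y = y) against the law of Y.
# For the bivariate normal, X | Y = y ~ N(rho * y, 1 - rho^2).
def cond_prob(yv):
    return 1 - norm.cdf(a_lo, loc=rho * yv, scale=np.sqrt(1 - rho**2))

rhs, _ = quad(lambda yv: cond_prob(yv) * norm.pdf(yv), b_lo, b_hi)

print(lhs, rhs)  # the two values should agree up to Monte Carlo error
```

The two numbers agree to a few decimal places, which is what the identity predicts; any jointly continuous pair $(X,Y)$ with known conditional law would do equally well here.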