Integrating with respect to marginal distribution


Let $(\Omega,\mathcal F, P)$ be a probability space and let $X$ and $Y$ denote continuous $\mathbb R^d$-valued random variables. The pushforward measures of $P$ under $X$ and $Y$ are denoted by $P_X$ and $P_Y$, respectively.

The marginal distributions of the $i$th components of $X$ and $Y$ are defined as $P_{X_i} := P_X\circ\Pi^*_{\{i\}}$ and $P_{Y_i} := P_Y\circ\Pi_{\{i\}}^*$. Here I defined $\Pi_{\{i\}}^*$ as a function from $\mathcal P(\mathbb R)$ to $\mathcal P(\mathbb R^d)$ such that $\Pi_{\{i\}}^*(A) = \{z\in\mathbb R^d : \langle e_i, z\rangle\in A\}$ for $A\subseteq\mathbb R$, i.e. $\Pi_{\{i\}}^*(A)$ is the inverse image of $A$ under the projection onto the $i$th component.

If $X$ and $Y$ are independent, the joint distribution of $(X,Y)$ is the product measure $P_X\otimes P_Y$ on $(\mathbb R^d\times\mathbb R^d, \mathcal B(\mathbb R^d)\otimes\mathcal B(\mathbb R^d))$, where $\mathcal B(\mathbb R^d)\otimes\mathcal B(\mathbb R^d)$ is the product $\sigma$-algebra. I want to compute $$\nu_i := P(X_i - Y_i < 0)$$ for $i=1,2,\dots, d$. My attempt: $$\begin{align*} \nu_i &= P(X_i - Y_i < 0) \\&= \int_{\mathbb R^d\times\mathbb R^d} 1_{\{x_i - y_i < 0\}}\,\mathrm d(P_X\otimes P_Y)(x,y) \\ &= \int_{\mathbb R^d}\int_{\mathbb R^d} 1_{\{x_i - y_i < 0\}}\,P_X(\mathrm dx)\,P_Y(\mathrm dy),\end{align*}$$ where I used Fubini's theorem in the last step. Now I am stuck. My problem is that I integrate with respect to $P_X$ and $P_Y$, but the indicator function only involves the $i$th components $x_i$ and $y_i$. So my idea was to replace $x_i-y_i$ by $\langle e_i, x-y\rangle$, but I don't see how this could help me.

Eventually I want to express $\nu_i$ as a Lebesgue–Stieltjes integral: $$\nu_i = \int F_{X_i}\,\mathrm dF_{Y_i}.$$ So my problem is essentially how to get rid of $\mathrm dP_X$ and $\mathrm dP_Y$ and instead integrate with respect to $\mathrm dP_{X_{\color{red}i}}$ and $\mathrm dP_{Y_{\color{red}i}}$.

On BEST ANSWER

A general formula is this: if $X$ and $Y$ are independent random variables, then $$ P[X<Y] = E[h(Y)], $$ where $h:\mathbb{R}\rightarrow [0,1]$ is the nondecreasing (hence measurable) function defined by $$ h(x) = P[X<x] \quad \forall x \in \mathbb{R}.$$


For a proof, you can use your existing steps with Fubini (taking $d=1$ to avoid distractions): $$ P[X<Y] = \int_{\mathbb R}\int_{\mathbb R} 1_{\{x<y\}}\,P_X(\mathrm dx)\,P_Y(\mathrm dy)=\int_{\mathbb R}\underbrace{\left[\int_{\mathbb R} 1_{\{x<y\}}\,P_X(\mathrm dx)\right]}_{h(y)}\,P_Y(\mathrm dy) = E[h(Y)].$$
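As a quick numerical sanity check (not part of the original answer; the particular normal distributions are an illustrative choice of mine), one can compare a direct Monte Carlo estimate of $P[X<Y]$ with an estimate of $E[h(Y)]$; since $X$ below has a continuous CDF, $h = F_X$ is just the standard normal CDF:

```python
import math
import random

random.seed(0)
n = 200_000

def Phi(t):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2)))

# Illustrative choice: X ~ N(0, 1) and Y ~ N(1, 1), independent.
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(1, 1) for _ in range(n)]

direct = sum(x < y for x, y in zip(xs, ys)) / n  # Monte Carlo for P[X < Y]
via_h  = sum(Phi(y) for y in ys) / n             # Monte Carlo for E[h(Y)]

# X - Y ~ N(-1, 2), so the exact value is Phi(1 / sqrt(2)) ≈ 0.7602.
print(direct, via_h)
```

Both estimates should agree with each other and with the exact value up to Monte Carlo error.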


If $X$ has a continuous CDF, then $h(x)=F_X(x)$ for all $x \in \mathbb{R}$, and the formula becomes $P[X<Y]=E[F_X(Y)]=\int F_X\,\mathrm dF_Y$, which is the Lebesgue–Stieltjes expression you are after. However, you cannot replace $h$ with $F_X$ in general. For a counterexample, consider $X=Y=0$ with probability 1, so $P[X<Y]=0$ and $E[h(Y)]=0$, but $E[F_X(Y)]=1$.
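The degenerate counterexample can also be worked out mechanically; a small sketch using exact rational arithmetic:

```python
from fractions import Fraction

# The counterexample above: X = Y = 0 with probability 1.
pmf = {0: Fraction(1, 1)}

h = lambda y: sum(p for x, p in pmf.items() if x < y)    # h(y)   = P[X <  y]
F = lambda y: sum(p for x, p in pmf.items() if x <= y)   # F_X(y) = P[X <= y]

p_less = sum(pmf[x] * pmf[y] for x in pmf for y in pmf if x < y)  # P[X < Y]
E_h    = sum(pmf[y] * h(y) for y in pmf)                          # E[h(Y)]
E_F    = sum(pmf[y] * F(y) for y in pmf)                          # E[F_X(Y)]

print(p_less, E_h, E_F)  # 0, 0, 1: E[h(Y)] matches P[X < Y], E[F_X(Y)] does not
```

The mismatch comes entirely from the atom of $P_X$ at $0$, where $h(0)=0$ but $F_X(0)=1$.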