Dealing with dependent random variables using Measure Theory


$\newcommand{\E}{\mathbb{E}} \newcommand{\PM}{\mathbb{P}}$ This question is inspired by this question, where the user asks how to prove that $\E[XI_E]=0$ if $X$ is integrable and $E$ has zero probability measure. It already has an answer, but I was thinking about an alternative approach and I was not sure about it. Here I will share my train of thought.

First Thoughts. At first I thought the proof was easy, because $X$ and $I_E$ are either independent or dependent. If they are independent, the claim is trivial. If they are dependent, then they both live on the same probability space $(\Omega,\mathcal F, \PM)$, hence: \begin{align}\tag{1} \E[XI_E]=\int_\Omega XI_E\, d\PM = \int_E X\,d\PM = 0 \end{align} because integrating over a null set gives zero.
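For completeness, here is one standard way (not spelled out in the question) to justify the last equality in (1) via simple functions, writing $X = X^+ - X^-$:

```latex
% Why \int_E X \, d\PM = 0 when \PM(E) = 0, via simple-function
% approximation (a standard argument).
\begin{align*}
% For any simple function s = \sum_i a_i I_{A_i} with 0 \le s \le X^+:
\int_E s \, d\PM &= \sum_{i} a_i \, \PM(A_i \cap E)
  \le \Big(\max_i a_i\Big) \PM(E) = 0. \\
% Taking the supremum over all such s, and arguing identically for X^-:
\int_E X^+ \, d\PM &= \sup_{0 \le s \le X^+} \int_E s \, d\PM = 0,
\qquad
\int_E X \, d\PM = \int_E X^+ \, d\PM - \int_E X^- \, d\PM = 0.
\end{align*}
```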

Second Thoughts. It cannot be that easy. The case written above, where they are dependent, is when they are fully dependent. So they can be either fully dependent or not fully dependent. In the second case they live on separate probability spaces: $X$ on $(\Omega_1, \mathcal F _1,\PM_1)$ and $I_E$ on $(\Omega_2, \mathcal F_2, \PM_2)$. Note that these two probability spaces may be identical. We also have the product probability space $(\Omega_1\times \Omega_2 , \mathcal F_1\otimes \mathcal F_2, \PM)$. But then we have: \begin{align} \E[XI_E] = \iint_{\Omega_1\times\Omega_2} X(\omega_1)I_E(\omega_2)\, d\PM(\omega_1,\omega_2) =\iint_{\Omega_1\times\Omega_2} XI_E\,d\PM \end{align} This can be dealt with using simple functions ($\star$).
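As a side remark (not part of the original argument): in the genuinely independent case, where $\PM = \PM_1 \otimes \PM_2$ is the product measure, Tonelli's theorem makes the claim immediate, since the integrand $|X(\omega_1)|\,I_E(\omega_2)$ is nonnegative and the double integral factorizes:

```latex
% Independent case: \PM = \PM_1 \otimes \PM_2, so by Tonelli's theorem
% the nonnegative integrand factorizes into a product of marginals.
\begin{align*}
\iint_{\Omega_1\times\Omega_2} |X(\omega_1)|\, I_E(\omega_2)\,
  d(\PM_1\otimes\PM_2)(\omega_1,\omega_2)
&= \left(\int_{\Omega_1} |X|\, d\PM_1\right)
   \left(\int_{\Omega_2} I_E \, d\PM_2\right) \\
&= \E|X| \cdot \PM_2(E) = 0,
\end{align*}
% hence X I_E = 0 \PM-a.e. and \E[XI_E] = 0.
```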

Question. Are my thoughts correct? In particular, the fully dependent case versus the not fully dependent case. If what I have written is wrong, how can I deal with dependent random variables using measure theory in this case? (The general case is also welcome, of course!)


Elaboration of $(\star)$, in case my thoughts are correct. There exists a sequence of simple functions $(Y_n)$ that increases monotonically and pointwise to $Y=|X|$. For a simple function we have $Y_n = \sum_{i=1}^N a_iI_{A_i}$, and hence: \begin{align} \iint_{\Omega_1\times\Omega_2} Y_n I_E \,d\PM = \sum_{i=1}^Na_i\PM(A_i\times E) \end{align} We have $\PM(A_i\times E)\leq \PM(\Omega_1\times E) = \PM_2(E)=0$. So: \begin{align} \iint_{\Omega_1\times\Omega_2} Y_n I_E \,d\PM = 0 \quad \forall n\in\mathbb{N} \end{align} and the claim follows by the monotone convergence theorem.


Edit. I followed the suggestion of @Did, namely to grab the book and read it. I see where I went wrong. In my lecture notes, when discussing independence, we had two random variables $X:(\Omega,\mathcal F)\to (\Omega_1,\mathcal F_1)$ and $Y:(\Omega,\mathcal F)\to (\Omega_2,\mathcal F_2)$. I was mixing up the domain and the target space. The domain is always the same when talking about the components of a random vector, but the targets may differ. But now everything is indeed easy, since we can write equation (1) and we are done.
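For the record, once $X$ and $I_E$ are defined on the same domain, the shortest route (an equivalent phrasing of equation (1)) is the almost-sure argument:

```latex
% Since \PM(E) = 0, the indicator I_E vanishes \PM-almost surely,
% so the product X I_E vanishes almost surely as well.
\begin{align*}
\PM(E) = 0
\;\Longrightarrow\; I_E = 0 \ \ \PM\text{-a.s.}
\;\Longrightarrow\; XI_E = 0 \ \ \PM\text{-a.s.}
\;\Longrightarrow\; \E[XI_E] = 0.
\end{align*}
```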