Understand a Principle of Independence from Probability Space


In my probability class, the professor mentioned a principle called the "principle of independence from probability space" without proving it. I couldn't find this principle in any of the textbooks available to me.

Here is what it says:

Let $(\Omega, \Sigma, \mathbb{P})$ be a probability space and let $X_1, \cdots, X_n$ be random variables or random vectors on it, with joint c.d.f. $F$. Suppose we want to verify a statement of the form "$F$ satisfies $(*)$." If $(\tilde{\Omega}, \tilde{\Sigma}, \tilde{\mathbb{P}})$ is any other probability space carrying random variables/vectors $\tilde{X}_1, \cdots, \tilde{X}_n$ with joint c.d.f. $\tilde{F} = F$, then it suffices to check the statement for $\tilde{X}_1, \cdots, \tilde{X}_n$.
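To make the principle concrete, here is a small finite-space illustration (my own invented example, not from the lecture): two entirely different sample spaces carry random variables with the same distribution, so any distribution-level quantity agrees between them.

```python
from fractions import Fraction as F

# Space 1: a fair die, X1 = indicator that the roll is even.
omega1 = {w: F(1, 6) for w in range(1, 7)}
X1 = lambda w: 1 if w % 2 == 0 else 0

# Space 2: a single fair coin toss, X2 = indicator of heads.
omega2 = {"H": F(1, 2), "T": F(1, 2)}
X2 = lambda w: 1 if w == "H" else 0

def law(space, rv):
    """Push the probability measure forward to get the distribution of rv."""
    dist = {}
    for w, p in space.items():
        dist[rv(w)] = dist.get(rv(w), F(0)) + p
    return dist

# Different sample spaces, identical distributions...
assert law(omega1, X1) == law(omega2, X2)

# ...so any statement about the distribution (e.g. an expectation)
# can be checked on whichever space is more convenient.
E1 = sum(p * X1(w) for w, p in omega1.items())
E2 = sum(p * X2(w) for w, p in omega2.items())
assert E1 == E2 == F(1, 2)
```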

It seems reasonable, but how should one show it?

Furthermore, one example that my Professor used this principle on is to show the following:

Fix a probability space $(\Omega, \Sigma, \mathbb{P})$. Then $$ \mathbb{E} YZ = \mathbb{E} Y \, \mathbb{E} Z $$ whenever $Y, Z$ are independent and $\mathbb{E}|YZ| < \infty$.

Here is the proof: By the principle of independence from probability space, pass to the product space $\Omega' = \{ (\omega_1, \omega_2) \} = \Omega \times \Omega$, the product $\sigma$-field $\Sigma' = \Sigma \otimes \Sigma$, and the product probability $\mathbb{P}' = \mathbb{P} \times \mathbb{P}$. Then \begin{align*} \mathbb{E} YZ &= \int_{\Omega'} Y(\omega_1)Z(\omega_2) \,d\mathbb{P}'(\omega_1, \omega_2) \\ &\stackrel{\text{Fubini--Tonelli}}{=} \int_{\Omega} Y(\omega_1)\left( \int_{\Omega} Z(\omega_2) \,d\mathbb{P}(\omega_2) \right) d\mathbb{P}(\omega_1) \\ &= \mathbb{E}Y \, \mathbb{E}Z. \end{align*}
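On a finite space, Fubini–Tonelli is just rearranging a double sum, so the whole product-space construction can be checked numerically. A minimal sketch (the space, the variables, and the helper `E` are my own invented example), assuming $Y$ reads one independent coordinate and $Z$ the other:

```python
from fractions import Fraction as F
from itertools import product

# A finite probability space on which Y and Z are independent:
# Omega = {0,1}^2 with the uniform measure; Y reads the first
# coordinate, Z the second.
base = {w: F(1, 4) for w in product([0, 1], repeat=2)}
Y = lambda w: w[0] + 1      # takes values 1, 2
Z = lambda w: 3 * w[1]      # takes values 0, 3

def E(space, rv):
    """Expectation of rv on a finite probability space."""
    return sum(p * rv(w) for w, p in space.items())

# The construction from the proof: Omega' = Omega x Omega with
# P' = P x P, and the new variable (w1, w2) |-> Y(w1) * Z(w2).
prod_space = {(w1, w2): p1 * p2
              for w1, p1 in base.items() for w2, p2 in base.items()}
YZ_prod = lambda w: Y(w[0]) * Z(w[1])

lhs = E(base, lambda w: Y(w) * Z(w))   # E[YZ] on the original space
rhs = E(prod_space, YZ_prod)           # the integral over Omega'
assert lhs == rhs == E(base, Y) * E(base, Z)
```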

I do not see how the principle of independence from probability space is used here. It seems that some details are hidden in this proof; for example, what are the new random variables defined here that have the same joint c.d.f.? Is there a way to extend this proof to include the details, so that I can understand how the principle is used?


On BEST ANSWER

The "proof" of this "principle of independence" is simply this: if your question depends only on the distribution, and $X$ has the same distribution as $Y$, then you may answer the question about $X$ by answering it for $Y$, because they have the same distribution and equal things are equal. For instance, if I want to evaluate $\operatorname{Pr}(X\in A)$ for some event $A$, and $Y$ has the same distribution as $X$, then by definition $\operatorname{Pr}(X\in A)=\operatorname{Pr}(Y\in A)$, so I can just work with the 'model' $Y$ whenever that is easier.
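Spelled out, this is the standard change-of-variables computation: if $X$ on $(\Omega,\Sigma,\mathbb{P})$ and $\tilde X$ on $(\tilde\Omega,\tilde\Sigma,\tilde{\mathbb{P}})$ satisfy $X_\ast\mathbb{P}=\tilde X_\ast\tilde{\mathbb{P}}$, then for every Borel function $g\ge 0$ (or with $g(X)$ integrable),

$$\mathbb{E}[g(X)]=\int_{\Omega}g(X)\,\mathrm{d}\mathbb{P}=\int_{\mathbb{R}}g(x)\,\mathrm{d}X_\ast\mathbb{P}(x)=\int_{\mathbb{R}}g(x)\,\mathrm{d}\tilde X_\ast\tilde{\mathbb{P}}(x)=\tilde{\mathbb{E}}[g(\tilde X)],$$

so every quantity expressible through $g$ and the distribution transfers between the two spaces.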

In applying the 'principle of independence' to the problem of $\Bbb E[ZY]=\Bbb E[Z]\Bbb E[Y]$, you want to show that $ZY:\Omega\to\Bbb R$ has the same distribution as $(ZY)':\Omega'\ni(\omega_1,\omega_2)\mapsto Y(\omega_1)Z(\omega_2)\in\Bbb R$. Since both arise by composing a pair of variables with the measurable map $(y,z)\mapsto yz$, it suffices to check that $(Y,Z):\Omega\ni\omega\mapsto(Y(\omega),Z(\omega))\in\Bbb R^2$ and $(Y,Z)':\Omega'\ni(\omega_1,\omega_2)\mapsto(Y(\omega_1),Z(\omega_2))\in\Bbb R^2$ have the same distribution; it further suffices to check this on products $U\times V$ with $U,V$ open, since such products generate the topology on $\Bbb R^2$ and thus the Borel $\sigma$-algebra. That check is very straightforward: by independence, $\Bbb P((Y,Z)\in U\times V)=\Bbb P(Y\in U)\,\Bbb P(Z\in V)$, and the product measure assigns exactly the same value to $\{(Y,Z)'\in U\times V\}$.
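This hidden step, that the pair $(Y,Z)$ on $\Omega$ and the pair $(\omega_1,\omega_2)\mapsto(Y(\omega_1),Z(\omega_2))$ on $\Omega\times\Omega$ have the same joint distribution when $Y,Z$ are independent, can also be verified directly on a finite space (again my own invented example):

```python
from fractions import Fraction as F
from itertools import product

# Y, Z independent on Omega = {0,1}^2 with the uniform measure.
base = {w: F(1, 4) for w in product([0, 1], repeat=2)}
Y = lambda w: w[0]
Z = lambda w: w[1]

def law(space, rv):
    """Distribution (pushforward measure) of rv on a finite space."""
    dist = {}
    for w, p in space.items():
        dist[rv(w)] = dist.get(rv(w), F(0)) + p
    return dist

# Omega' = Omega x Omega with P' = P x P.
prod_space = {(w1, w2): p1 * p2
              for w1, p1 in base.items() for w2, p2 in base.items()}

# Joint law of (Y,Z) on Omega vs joint law of (Y,Z)' on Omega'.
law_orig = law(base, lambda w: (Y(w), Z(w)))
law_prod = law(prod_space, lambda w: (Y(w[0]), Z(w[1])))
assert law_orig == law_prod  # independence makes the joint laws coincide

# Hence the laws of the products ZY and (ZY)' agree as well.
law_ZY_orig = law(base, lambda w: Y(w) * Z(w))
law_ZY_prod = law(prod_space, lambda w: Y(w[0]) * Z(w[1]))
assert law_ZY_orig == law_ZY_prod
```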

Then, since there is an equality of distributions $(ZY)_\ast\mathbb{P}=((ZY)')_\ast\mathbb{P}'$, we have:

$$\Bbb E[ZY]=\int_{\Bbb R}x\,\,\mathrm{d}(ZY)_\ast\mathbb{P}(x)=\int_{\Bbb R}x\,\,\mathrm{d}((ZY)')_\ast\mathbb{P}'(x)=\Bbb E[(ZY)']$$

So, it suffices to 'solve the problem' for the variable $(ZY)'$.