Implications of conditional independence between random variables


Consider two probability spaces $(\mathcal{I}, \mathcal{F}, \mathbb{P})$ and $(\mathcal{J}, \mathcal{G}, \mathbb{P})$, where for brevity the same symbol $\mathbb{P}$ is used for both measures. Fix any $a\in \mathbb{R}$.

In the notation below, $1\{\cdots\}$ is $1$ if the condition inside is satisfied and $0$ otherwise.

Consider the following random variables:

1) $\forall i \in \mathcal{I}$, $e^i: \mathcal{J}\rightarrow \mathbb{R}$

2) $Z: \mathcal{J}\rightarrow \mathcal{Z}\subseteq \mathbb{R}$, with $\mathcal{Z}$ finite

3) $W: \mathcal{I}\rightarrow [0,1]$, where $W(i)\equiv \mathbb{P}\Big(\{j\in \mathcal{J} \text{ s.t. } e^i(j)\leq a\}\Big)$

4) $Q: \mathcal{I}\rightarrow \{0,1\}$, where $Q(i)\equiv 1\{W(i)>0\}$

5) $\forall z \in \mathcal{Z}$, $W_z: \mathcal{I}\rightarrow [0,1]$, where $W_z(i)\equiv \mathbb{P}\Big(\{j\in \mathcal{J} \text{ s.t. } e^i(j)\leq a\}\Big| \{j\in \mathcal{J} \text{ s.t. } Z(j)=z\}\Big)$

6) $\forall z \in \mathcal{Z}$, $Q_z: \mathcal{I}\rightarrow \{0,1\}$, where $Q_z(i)\equiv 1\{W_z(i)>0\}$

Assume that, for some $z\in \mathcal{Z}$, $$ E\Big(W\Big)= E\Big(W_z\Big) $$ where $E$ denotes expectation (here over $\mathcal{I}$, since $W$ and $W_z$ are defined on $\mathcal{I}$).

Does this imply $$ E\Big(Q \Big)= E\Big(Q_z\Big) \text{ ?} $$

I've done some simulations, and it seems the answer is no, but I would like some help formalising this. Also, are we using anywhere the facts that $\mathcal{Z}$ is finite and that the two probability spaces have the same probability measure?
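A simulation of the kind mentioned can be sketched on two-point spaces; the specific spaces, the values of $e^i$, and $a=0$ are assumptions chosen for illustration. Taking both spaces uniform on $\{1,2\}$, $Z(j)=j$, $z=1$, $e^1=(1,-1)$, $e^2=(-1,1)$ appears to give $E(W)=E(W_1)$ while $E(Q)\neq E(Q_1)$:

```python
# Hypothetical two-point example: I = J = {1, 2}, both uniform; a = 0.
I = [1, 2]
J = [1, 2]
a = 0
Z = {1: 1, 2: 2}               # Z(j) = j, so Zcal = {1, 2}
z = 1

# e^i(j): chosen so conditioning on {Z = z} flips some indicators
# without changing the average of W.
e = {(1, 1): 1,  (1, 2): -1,   # e^1 = (1, -1)
     (2, 1): -1, (2, 2): 1}    # e^2 = (-1, 1)

def W(i):
    # P({j : e^i(j) <= a}) under the uniform measure on J
    return sum(1 for j in J if e[(i, j)] <= a) / len(J)

def W_z(i):
    # P({j : e^i(j) <= a} | {j : Z(j) = z})
    Jz = [j for j in J if Z[j] == z]
    return sum(1 for j in Jz if e[(i, j)] <= a) / len(Jz)

EW   = sum(W(i) for i in I) / len(I)             # average of (1/2, 1/2) = 1/2
EW_z = sum(W_z(i) for i in I) / len(I)           # average of (0, 1)     = 1/2
EQ   = sum(int(W(i) > 0) for i in I) / len(I)    # average of (1, 1)     = 1
EQ_z = sum(int(W_z(i) > 0) for i in I) / len(I)  # average of (0, 1)     = 1/2

print(EW, EW_z, EQ, EQ_z)  # -> 0.5 0.5 1.0 0.5
```

If this is right, it suggests the implication fails: $W$ and $W_1$ have the same mean, but $W_1$ vanishes on part of $\mathcal{I}$ where $W$ is strictly positive, so the indicators average differently.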