Joint expectation of independent random variables


Consider a function $g: \mathbb{R}^2 \to \mathbb{R}$. Let $D$ and $Z$ be random variables. As such, $g(Z,D)$ is a random variable itself.

By the law of iterated expectation, we have $$ \mathbb{E}[g(Z,D)] = \mathbb{E}_D\big[\mathbb{E}[g(Z, D) \mid D]\big], $$ where the inner expectation is conditional on $D$. When $Z$ and $D$ are independent, this inner conditional expectation is just the expectation over $Z$ with $D$ held fixed, so we may write it as $\mathbb{E}_Z[g(Z,D)]$.

Now, if $Z$ and $D$ are also independent, does the following hold? $$ \mathbb{E}_D [\mathbb{E}_Z[g(Z, D)]] = \mathbb{E}_{(D,Z)}[g(Z,D)] $$ where $(D,Z)$ is a random variable / vector and the expectation is over the joint distribution of $D$ and $Z$.
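As a numerical sanity check of this identity in the continuous case, here is a small Monte Carlo sketch. The distributions and the choice $g(z,d) = zd + d$ are my own made-up example: with $Z \sim N(0,1)$ and $D \sim N(1,1)$ independent, $\mathbb{E}_Z[g(Z,d)] = d\,\mathbb{E}[Z] + d = d$, so the iterated expectation is $\mathbb{E}[D] = 1$, and the joint expectation should agree.

```python
import random

# Made-up example: Z ~ N(0, 1), D ~ N(1, 1), drawn independently,
# with g(z, d) = z * d + d. Analytically E_D[E_Z[g(Z, D)]] = E[D] = 1.
random.seed(0)
n = 200_000
total = 0.0
for _ in range(n):
    z = random.gauss(0.0, 1.0)
    d = random.gauss(1.0, 1.0)  # sampled independently of z
    total += z * d + d          # g(z, d)

# Estimates the joint expectation E_{(D,Z)}[g(Z, D)]; should be near 1.0
joint_estimate = total / n
print(joint_estimate)
```

The sample mean over independent draws of $(Z, D)$ estimates the joint expectation directly, and it lands near the analytic iterated-expectation value of $1$.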

For discrete random variables, one could write, with $f$ denoting the relevant probability mass functions and the sums running over the supports of the random variables: $$ \begin{align} \mathbb{E}_D [\mathbb{E}_Z[g(Z, D)]] &= \sum_d \mathbb E_Z[g(Z, d)] f(d) \\ &= \sum_d \sum_z g(z,d) f(z)f(d) \\ &= \sum_{(d,z)} g(z,d) f(z,d) \\ &= \mathbb{E}_{(D,Z)}[g(Z,D)] \end{align} $$ where the third equality uses independence, $f(z,d) = f(z)f(d)$. Is that correct? Would the proof for continuous variables work similarly, with integrals in place of the sums?
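The discrete chain of equalities can also be checked exactly by brute force. The example below uses made-up independent distributions for $Z$ and $D$ on $\{0,1,2\}$ and a made-up $g(z,d) = zd + 1$; both sides should coincide to machine precision.

```python
import itertools

# Made-up independent PMFs for Z and D on {0, 1, 2}, and g(z, d) = z*d + 1
f_z = {0: 0.2, 1: 0.5, 2: 0.3}
f_d = {0: 0.1, 1: 0.6, 2: 0.3}
g = lambda z, d: z * d + 1

# Left-hand side: E_D[ E_Z[ g(Z, d) ] ], computed as nested sums
inner = {d: sum(g(z, d) * pz for z, pz in f_z.items()) for d in f_d}
lhs = sum(inner[d] * pd for d, pd in f_d.items())

# Right-hand side: expectation over the joint distribution,
# where independence gives f(z, d) = f(z) * f(d)
rhs = sum(g(z, d) * f_z[z] * f_d[d]
          for z, d in itertools.product(f_z, f_d))

print(abs(lhs - rhs) < 1e-12)  # True
```

Here the nested-sum and joint-sum computations agree exactly, mirroring the algebra above: swapping the order of summation and factoring the joint PMF are the only steps involved.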