Multiplication of Expected Values


I am trying to understand my attempted proof that $E[X*Y] = E[X]*E[Y]$ for independent random variables $X,Y$.

Here I assign the product $X*Y$ to a new random variable $Z$:

$E[Z] = E[X*Y]$

So I can write the following (which I will try to bring into the form $E[X*Y] = E[X]*E[Y]$):

$E[Z] = \sum_{\psi \in \Omega} Z(\psi)* \Pr[\psi]$

Because $X$ and $Y$ are independent, I can write this as:

$E[Z] = \sum_{\psi \in \Omega} Z(\psi)* \Pr[X = x \land Y=y]$ and then

$E[Z] = \sum_{\psi \in \Omega} X(\omega)*Y(\phi)* \Pr[X = x] * \Pr[Y=y]$

Since $\psi$ ranges over all pairs $(\omega_1, \phi_1)$ through $(\omega_n, \phi_n)$ (and this is the step I am not sure about — can you explain why we can do this?), we can rewrite the sum over all $\psi$ as

$E[Z] = \sum_{\omega \in \Omega} \sum_{\phi \in \Omega} X(\omega)*Y(\phi)* \Pr[X = x] * \Pr[Y=y]$

We can then pull the factors that do not depend on $\phi$ out of the inner sum:

$E[Z] = \sum_{\omega \in \Omega}X(\omega)* Pr[X = x] * \sum_{\phi \in \Omega} Y(\phi)* Pr[Y=y] $

$E[Z] = (\sum_{\omega \in \Omega}X(\omega)* Pr[X = x])*( \sum_{\phi \in \Omega} Y(\phi)* Pr[Y=y] )$

$E[Z] = E[X] * E[Y]$

At that point the proof is complete. I would like some better insight into why $\psi$ ranges over all pairs of $\omega$ and $\phi$.
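As a concrete illustration of what I mean (my own toy example, not part of the proof): take two independent fair coin flips, so $\Omega = \{H,T\} \times \{H,T\}$ and every outcome $\psi$ is literally a pair $(\omega, \phi)$. With $X = 1$ if the first coin is heads (else $0$) and $Y = 1$ if the second coin is heads:

$$\begin{align} E[XY] &= \sum_{\psi \in \Omega} X(\psi)\,Y(\psi)\,\Pr[\psi] \\ &= \underbrace{1\cdot 1\cdot \tfrac14}_{\psi=(H,H)} + \underbrace{1\cdot 0\cdot \tfrac14}_{\psi=(H,T)} + \underbrace{0\cdot 1\cdot \tfrac14}_{\psi=(T,H)} + \underbrace{0\cdot 0\cdot \tfrac14}_{\psi=(T,T)} = \tfrac14 \\ &= \tfrac12 \cdot \tfrac12 = E[X]\cdot E[Y] \end{align}$$

Here the single sum over $\psi$ visibly is a double sum over the first and second coordinates.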


Best answer:

Noting that $\{\{\psi\in\Omega: X(\psi)=x,\ Y(\psi)=y\}: x\in X(\Omega),\ y\in Y(\Omega)\}\setminus\{\emptyset\}$ is a partition of the sample space $\Omega$, we have:

$$\begin{align}
\mathsf E(XY) ~&=~ \sum_{\psi\in\Omega} X(\psi)\,Y(\psi)\,\Pr(\psi) &\because~& \text{Definition of Expectation} \\
&&& {\tiny\text{(for sample spaces with countably many outcomes)}} \\[1ex]
&=~ \sum_{x\in X(\Omega)}\sum_{y\in Y(\Omega)}\ \sum_{\psi: X(\psi)=x,\, Y(\psi)=y} X(\psi)\,Y(\psi)\,\Pr(\psi) &\because~& \text{Partitioning} \\[1ex]
&=~ \sum_{x\in X(\Omega)}\sum_{y\in Y(\Omega)} x\,y\,\Pr\{\psi: X(\psi)=x,\, Y(\psi)=y\} &\because~& \text{Sigma Additivity of Probability} \\[1ex]
&=~ \sum_{x\in X(\Omega)}\sum_{y\in Y(\Omega)} x\,y\,\Pr\{\psi: X(\psi)=x\}\,\Pr\{\psi: Y(\psi)=y\} &\because~& \text{Independence} \\[1ex]
&=~ \sum_{x\in X(\Omega)}\sum_{y\in Y(\Omega)} x\,y\,\Pr\{\psi: X(\psi)=x\}\,\Pr\{\phi: Y(\phi)=y\} &\because~& \text{Alpha-Substitution} \\[1ex]
&=~ \sum_{x\in X(\Omega)} x\,\Pr\{\psi: X(\psi)=x\} \cdot \sum_{y\in Y(\Omega)} y\,\Pr\{\phi: Y(\phi)=y\} \\[1ex]
&=~ \sum_{\psi\in\Omega} X(\psi)\,\Pr(\psi) \cdot \sum_{\phi\in\Omega} Y(\phi)\,\Pr(\phi) &\because~& \text{Partitioning (Reversal)} \\[1ex]
&=~ \mathsf E(X)\cdot\mathsf E(Y) &\because~& \text{Definition of Expectation}
\end{align}$$
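As a quick numerical sanity check of the identity (my own sketch, not part of the answer above; the two distributions below are arbitrary choices), one can build a joint distribution over pairs by multiplying the marginals, as independence prescribes, and compare $E[XY]$ against $E[X]\,E[Y]$:

```python
from itertools import product

# Arbitrary example distributions: X is a fair die, Y a biased coin.
px = {x: 1 / 6 for x in range(1, 7)}   # marginal distribution of X
py = {0: 0.3, 1: 0.7}                  # marginal distribution of Y

# Independence: the probability of each pair (x, y) factors into the marginals.
joint = {(x, y): px[x] * py[y] for x, y in product(px, py)}

# E[XY], summing over the product sample space of pairs psi = (x, y).
e_xy = sum(x * y * p for (x, y), p in joint.items())

# E[X] and E[Y], computed from the marginals.
e_x = sum(x * p for x, p in px.items())
e_y = sum(y * p for y, p in py.items())

print(e_xy, e_x * e_y)  # the two agree up to floating-point error
```

Here the dictionary comprehension over `product(px, py)` plays the role of the double sum over $\omega$ and $\phi$: every outcome of the joint experiment is a pair.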