Proof of $ E(XY) = E(X) E(Y) $


When two random variables are statistically independent, the expectation of their product is the product of their expectations.

I found this on Wikipedia: https://en.wikipedia.org/wiki/Product_distribution

$$ E(XY) = E(X) E(Y) $$

Nevertheless, I can't find a simple proof. I think my probability books are skipping a few steps when they write:

$$ E(XY) = \sum_{x \in D_1 } \sum_{y \in D_2} xy\,P(X = x)\,P(Y=y). $$ I don't understand this equality, because I was thinking that $$ E(XY) = \sum_{z} z\,P(XY = z), $$ where $z$ ranges over the possible values of $XY$, is true (I guess), and I don't know how to go from the second equality to the first one.
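As a sanity check, the two expressions can be compared numerically. This is a small Python sketch (not from the original post; the pmfs below are made up for illustration) that computes the double sum over $x$ and $y$, then builds the pmf of $Z = XY$ and computes $\sum_z z\,P(XY=z)$:

```python
from fractions import Fraction
from collections import defaultdict

# Hypothetical pmfs for independent X and Y (values chosen arbitrarily).
pX = {1: Fraction(1, 2), 2: Fraction(1, 4), 3: Fraction(1, 4)}
pY = {1: Fraction(1, 3), 4: Fraction(2, 3)}

# First form: double sum over x and y.
e_double = sum(x * y * pX[x] * pY[y] for x in pX for y in pY)

# Second form: build the pmf of Z = XY, then sum z * P(XY = z).
pZ = defaultdict(Fraction)
for x in pX:
    for y in pY:
        pZ[x * y] += pX[x] * pY[y]  # independence: P(X=x, Y=y) = P(X=x)P(Y=y)
e_single = sum(z * p for z, p in pZ.items())

print(e_double, e_single)  # the two forms agree
```

Both forms give the same value because grouping the terms of the double sum by the product $z = xy$ is exactly what the pmf of $XY$ does.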

The Wikipedia page gives another proof using conditional expectation, which is something I'm rather unfamiliar with. Could you please give me a simple explanation/proof of the link between the two LaTeX lines?

Thank you!


BEST ANSWER

As you do in your problem, we shall restrict ourselves to the discrete case, where $X$ and $Y$ are random variables taking at most countably infinitely many possible values. Let us call this set of values $S$. Let $T$ denote the set of products of two elements in $S$. Then $XY$ is a discrete random variable taking values in the set $T$.

Now suppose I ask you what the probability is that $XY$ takes the value $z$. Clearly, if $X = x$ (assume for simplicity that $0$ is not among the possible values, so that $z/x$ makes sense), then $Y$ must equal $z/x$. And since $X$ must take some value, we must have that

$$ P(XY=z) = \sum_{x \in S} P(X = x \text{ and } Y = z/x). $$

Now, since $X$ and $Y$ are independent, we have that $P(X = x \text{ and } Y = z/x) = P(X=x)P(Y=z/x)$. Thus

$$ P(XY=z) = \sum_{x \in S} P(X = x) P(Y = z/x). $$
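This identity can be checked numerically. The following Python sketch (the support and pmfs are made up for illustration) computes $P(XY=z)$ via the sum over $x$, treating $P(Y = z/x)$ as $0$ when $z/x \notin S$, and cross-checks it against direct enumeration of the joint distribution:

```python
from fractions import Fraction

# Hypothetical pmfs on a common support S (nonzero values, chosen arbitrarily).
S = [1, 2, 4]
pX = {1: Fraction(1, 2), 2: Fraction(1, 4), 4: Fraction(1, 4)}
pY = {1: Fraction(1, 6), 2: Fraction(1, 3), 4: Fraction(1, 2)}

def p_product(z):
    # P(XY = z) = sum over x of P(X = x) * P(Y = z/x),
    # where P(Y = y) = 0 for any y outside the support S.
    return sum(pX[x] * pY.get(Fraction(z, x), Fraction(0)) for x in S)

# Cross-check against direct enumeration of the joint distribution.
T = {x * y for x in S for y in S}
for z in T:
    direct = sum(pX[x] * pY[y] for x in S for y in S if x * y == z)
    assert p_product(z) == direct
```

Note that `Fraction(z, x)` compares (and hashes) equal to the integer it represents, so the dictionary lookup `pY.get(...)` works even though the keys of `pY` are plain integers.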

By definition of expectation,

$$ E(XY) = \sum_{z \in T} zP(XY=z). $$

Using our expression we just computed for $P(XY=z)$, we substitute

$$ E(XY) = \sum_{z \in T} \sum_{x \in S} zP(X=x)P(Y=z/x). $$

Note that if $z/x$ is not in our set $S$, $P(Y = z/x) = 0$, so we may simplify the above summation to

$$ E(XY) = \sum_{x \in S} \sum_{z \,:\, z/x \in S} zP(X=x)P(Y=z/x). $$

Now make the substitution $y = z/x$. Noting that $z = x\cdot z/x = xy$, we conclude

$$ E(XY) = \sum_{y \in S} \sum_{x \in S} xyP(X=x)P(Y=y). $$

From here, we factor

$$ E(XY) = \sum_{y \in S} \sum_{x \in S} xyP(X=x)P(Y=y) = \left(\sum_{x \in S}xP(X=x) \right)\left(\sum_{y \in S}yP(Y=y) \right) = E(X)E(Y), $$

and the desired result has been obtained.
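The whole argument can be traced on a small example. This Python sketch (with arbitrary made-up pmfs, not part of the original answer) builds the distribution of $XY$ over the product set $T$ exactly as in the proof, then compares $E(XY)$ with $E(X)E(Y)$:

```python
from fractions import Fraction
from collections import defaultdict

# Hypothetical pmfs for independent X and Y; negative values are fine too.
pX = {-1: Fraction(1, 3), 2: Fraction(1, 3), 5: Fraction(1, 3)}
pY = {1: Fraction(1, 2), 3: Fraction(1, 2)}

# Build P(XY = z) for z in T, as in the proof.
pZ = defaultdict(Fraction)
for x, px in pX.items():
    for y, py in pY.items():
        pZ[x * y] += px * py  # independence: P(X=x, Y=y) = P(X=x)P(Y=y)

# E(XY) = sum over z in T of z * P(XY = z).
e_xy = sum(z * p for z, p in pZ.items())
e_x = sum(x * p for x, p in pX.items())
e_y = sum(y * p for y, p in pY.items())

assert e_xy == e_x * e_y  # E(XY) = E(X)E(Y)
```

The assertion holds with exact rational arithmetic, so there is no floating-point slack hiding in the comparison.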

SECOND ANSWER

Here is a rough outline (not fully rigorous):

$$\mathbb{E}[XY] = \sum_z z\,P(XY=z)$$ $$ = \sum_{y,z} z\,P(X=z/y \mid Y=y)\,P(Y=y) $$ $$= \sum_{y,z} z\,P(X = z/y)\,P(Y=y)$$ $$= \sum_{x,y} xy\,P(X=x)\,P(Y=y) = \mathbb{E}[X]\,\mathbb{E}[Y]$$

From the 2nd to the 3rd line, I used the independence of $X$ and $Y$ to claim that $P(X=a \mid Y=b) = P(X=a)$. In the last line, I substituted $x = z/y$, so that $z = xy$ and the sum over $(y, z)$ becomes a sum over $(x, y)$.