Proof in Georgii of $\mathbb E(XY) = \mathbb E(X) \mathbb E(Y)$ for independent random variables


I'm reading "Introduction to Probability and Statistics" by Georgii, and can't figure out how to formally show one of the steps of the proof of $\mathbb E(XY) = \mathbb E(X) \mathbb E(Y)$.

$\newcommand{\P}{\mathbb P} \newcommand{\E}{\mathbb E} $ Let $X,Y\colon(\Omega,\mathcal F,\P)\to \mathbb R$ be independent random variables such that $X(\Omega)$ and $Y(\Omega)$ are at most countable, and such that the expected values of both exist.

The proof (slightly altered) (Theorem 4.7 (d) in the book): $$ \begin{align} \E(XY) &= \sum_{z} z \,\P(XY=z) \\&= \sum_{z\neq 0} z \sum_{x\neq 0} \P(X=x,Y=z/x) \\&= \sum_{x\neq 0,\, y\neq 0} xy \,\P(X=x)\,\P(Y=y) \\&= \E(X)\,\E(Y) \end{align} $$

I'm having trouble with the step: $$ \sum_{z\neq 0} z \sum_{x\neq 0} \P(X=x,Y=z/x) = \sum_{x\neq 0,\, y\neq 0} xy \,\P(X=x)\,\P(Y=y) $$

I guess the idea is to substitute $y$ for $z/x$: $$ \begin{align} \sum_{z\neq 0} z \sum_{x\neq 0} \P(X=x,Y=z/x) &= \sum_{x\neq 0} \sum_{z\neq 0} z \,\P(X=x,Y=z/x) \\&= \sum_{x\neq 0} \sum_{z/x \neq 0} x\cdot \frac zx \,\P\!\left(X=x,Y=\tfrac zx\right) \\&= \sum_{x\neq 0} \sum_{y\neq 0} xy \,\P(X=x,Y=y) \end{align} $$
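Independently of a formal argument, the reindexing can be sanity-checked numerically for finite supports. The sketch below uses hypothetical pmfs `pX` and `pY` of my own choosing (not from the book), with $0$ deliberately placed in both supports to exercise the $z\neq 0$ restriction:

```python
from fractions import Fraction as F
from itertools import product

# Hypothetical finite pmfs (toy illustration, not from Georgii);
# 0 is deliberately in both supports to exercise the z != 0 restriction.
pX = {-1: F(1, 4), 0: F(1, 4), 2: F(1, 2)}
pY = {0: F(1, 3), 1: F(1, 3), 3: F(1, 3)}

# Under independence the joint pmf factors: P(X=x, Y=y) = P(X=x) P(Y=y).
def joint(x, y):
    return pX[x] * pY[y]

# Left side: sum_{z != 0} z * sum_{x != 0} P(X=x, Y=z/x).
zs = {x * y for x, y in product(pX, pY) if x * y != 0}
lhs = sum(
    z * sum(joint(x, F(z, x)) for x in pX if x != 0 and F(z, x) in pY)
    for z in zs
)

# Right side after reindexing y = z/x: sum_{x != 0, y != 0} x*y * P(X=x) P(Y=y).
rhs = sum(x * y * pX[x] * pY[y]
          for x, y in product(pX, pY) if x != 0 and y != 0)

print(lhs == rhs)  # the two sums agree
```

Exact `Fraction` arithmetic avoids any floating-point ambiguity in the comparison.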

While it certainly looks obvious enough, I'm looking for a proof that precisely states the sets over which the sums run, and explains the equivalence I'm having trouble with.

A possible start: $$ \begin{align} \E(XY) &= \sum_{\substack{z\in \mathbb R \\ \exists\, x\in X(\Omega),\, y\in Y(\Omega):\ xy = z}} z \,\P(XY=z) \\&= \sum_{\substack{z\in \mathbb R \\ \exists\, x\in X(\Omega),\, y\in Y(\Omega):\ xy = z}} z \sum_{\substack{x\in X(\Omega)\\ x\neq 0}} \P(X=x,Y=z/x) \\&= \;\cdots \end{align} $$

Best answer:

For $z \neq 0$ we have:

$$ \begin{align} \mathbb{P}(XY=z) &= \mathbb{P}\Bigl(XY=z,\ \bigcup_{x \in X(\Omega)}\{X=x\}\Bigr) = \mathbb{P}\Bigl(\bigcup_{x \in X(\Omega)}\{XY=z,\,X=x\}\Bigr) \\&= \sum_{x \in X(\Omega)}\mathbb{P}(XY=z,\,X=x) = \sum_{\substack{x \in X(\Omega)\\ x \neq 0}}\mathbb{P}(Y=z/x,\,X=x), \end{align} $$

where the sum appears by countable additivity (the events $\{X=x\}$ are pairwise disjoint), and the last equality holds because $\{XY=z,\,X=x\}=\{Y=z/x,\,X=x\}$ for $x \neq 0$, while the $x=0$ term vanishes since $z \neq 0$.
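For a finite toy example this partition identity can be checked directly. The pmfs below are my own illustration (not from the source), assuming independence so the joint pmf factors:

```python
from fractions import Fraction as F
from itertools import product

# Hypothetical independent pmfs (illustration only); both supports contain 0.
pX = {-1: F(1, 4), 0: F(1, 4), 2: F(1, 2)}
pY = {0: F(1, 3), 1: F(1, 3), 3: F(1, 3)}

def p_xy_eq(z):
    """P(XY = z), computed directly from the joint pmf."""
    return sum(pX[x] * pY[y] for x, y in product(pX, pY) if x * y == z)

def partitioned(z):
    """sum over x in X(Omega), x != 0, of P(Y = z/x, X = x)."""
    return sum(pX[x] * pY.get(F(z) / x, F(0)) for x in pX if x != 0)

# The identity holds for every z != 0 in the support of XY.
ok = all(p_xy_eq(z) == partitioned(z)
         for z in {x * y for x, y in product(pX, pY)} if z != 0)
print(ok)
```

Note that `Fraction` values hash equal to equal integers, so `pY.get(F(z) / x, ...)` correctly looks up integer support points.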

Since for $z=0$ the term $z\,\mathbb{P}(XY=z)$ equals $0$, we have:

$\sum_{z \in X(\Omega)Y(\Omega)}z\mathbb{P}(XY=z)= \sum_{z \in X(\Omega)Y(\Omega)}z \sum_{x \in X(\Omega), x \neq 0}\mathbb{P}(Y=z/x,X=x)$

where $X(\Omega)Y(\Omega)= \{ z \in \mathbb{R} : z=xy,\ x \in X(\Omega),\ y \in Y(\Omega) \}$.

By Fubini's theorem, we can interchange the sums in the last term to get:

$$ \begin{align} \sum_{\substack{x \in X(\Omega)\\ x \neq 0}}\ \sum_{z \in xY(\Omega)} z\, \mathbb{P}(Y=z/x,\,X=x) &= \sum_{\substack{x \in X(\Omega)\\ x \neq 0}}\ \sum_{z/x \in Y(\Omega)} x\cdot\frac{z}{x}\, \mathbb{P}(Y=z/x,\,X=x) \\&= \sum_{\substack{x \in X(\Omega)\\ x \neq 0}}\ \sum_{y \in Y(\Omega)} xy\, \mathbb{P}(Y=y,\,X=x) \\&= \sum_{x \in X(\Omega)}\ \sum_{y \in Y(\Omega)} xy\, \mathbb{P}(Y=y,\,X=x), \end{align} $$

where the last equality holds because the $x=0$ terms vanish. Finally, by independence $\mathbb{P}(Y=y,\,X=x)=\mathbb{P}(X=x)\,\mathbb{P}(Y=y)$, so the double sum factors as $\bigl(\sum_{x} x\,\mathbb{P}(X=x)\bigr)\bigl(\sum_{y} y\,\mathbb{P}(Y=y)\bigr)=\mathbb{E}(X)\,\mathbb{E}(Y)$.
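Putting the whole chain together, a direct numerical check (again with hypothetical finite pmfs of my own choosing) confirms the product formula end to end:

```python
from fractions import Fraction as F
from itertools import product

# Hypothetical independent pmfs (toy illustration, not from the book).
pX = {-1: F(1, 4), 0: F(1, 4), 2: F(1, 2)}
pY = {0: F(1, 3), 1: F(1, 3), 3: F(1, 3)}

# E(XY) via the distribution of Z = XY: sum_z z * P(XY = z).
support_Z = {x * y for x, y in product(pX, pY)}
E_XY = sum(
    z * sum(pX[x] * pY[y] for x, y in product(pX, pY) if x * y == z)
    for z in support_Z
)

# E(X) and E(Y) computed separately from the marginal pmfs.
E_X = sum(x * p for x, p in pX.items())   # -1/4 + 0 + 1 = 3/4
E_Y = sum(y * p for y, p in pY.items())   # 0 + 1/3 + 1 = 4/3
print(E_XY == E_X * E_Y)
```

The $z=0$ term contributes nothing to `E_XY`, mirroring why the proof may restrict all three sums to nonzero values.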