Sum of two sequences of independent random variables


Given mutually independent random variables $X_1,X_2,Y_1,Y_2$, I want to show that $X_1+Y_1$ and $X_2+Y_2$ are independent as well. Is it sufficient to show that $$E\big((X_1+Y_1)(X_2+Y_2)\big)=E(X_1+Y_1)\,E(X_2+Y_2)?$$
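To see why the proposed condition cannot be enough on its own, here is a small self-contained counterexample sketch (my own illustration, not from the original post): two random variables can satisfy $E(UV)=E(U)E(V)$, i.e. be uncorrelated, while still being dependent.

```python
from fractions import Fraction as F

# Let X be uniform on {-1, 0, 1}, and set U = X, V = X**2.
outcomes = [-1, 0, 1]
p = F(1, 3)  # probability of each outcome

E = lambda f: sum(p * f(x) for x in outcomes)  # exact expectation

EU  = E(lambda x: x)         # E[U]  = E[X]   = 0
EV  = E(lambda x: x**2)      # E[V]  = E[X^2] = 2/3
EUV = E(lambda x: x * x**2)  # E[UV] = E[X^3] = 0

assert EUV == EU * EV        # uncorrelated: E[UV] = E[U]E[V]

# ...but U and V are clearly dependent:
P_U1_V1 = p                  # P(U=1, V=1) = P(X=1)     = 1/3
P_U1, P_V1 = p, 2 * p        # P(U=1) = 1/3, P(V=1)     = 2/3
assert P_U1_V1 != P_U1 * P_V1  # 1/3 != 2/9, so not independent
```

So the expectation identity alone only establishes uncorrelatedness, which is strictly weaker than independence.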


2 Answers

BEST ANSWER

The usual proof uses measure theory. Using the notion of independent $\sigma$-algebras, one can prove that if

$$\{X_i:i \in I\}$$ is a collection of independent random variables, and $\{I_j: j \in J\}$ is a collection of disjoint subsets of $I$, then the collections of random variables

$$\{X_i: i \in I_j\}, j\in J$$

are also independent.

Putting $X_3:= Y_1, X_4:= Y_2$ and $I= \{1,2,3,4\}$, we get that

$$\{X_1, X_3\}, \{X_2,X_4\}$$

are independent collections. This means that $\sigma(X_1, X_3)$ and $\sigma(X_2, X_4)$ are independent. Since

$$\sigma(X_1 + X_3) \subseteq \sigma(X_1, X_3)$$ $$\sigma(X_2 + X_4) \subseteq \sigma(X_2, X_4)$$

(each sum is measurable with respect to the $\sigma$-algebra generated by its two summands), we also get that the $\sigma$-algebras $\sigma(X_1+X_3)$ and $\sigma(X_2 + X_4)$ are independent. This means that the random variables

$$X_1+X_3 = X_1+Y_1$$

and $$X_2 + X_4 = X_2 + Y_2$$

are independent.
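The abstract argument can be sanity-checked numerically. Here is a minimal Monte Carlo sketch (the uniform distributions and the particular events are my own choices for illustration): if the two sums are independent, then $P(A \cap B) \approx P(A)\,P(B)$ for events $A$ about $X_1+Y_1$ and $B$ about $X_2+Y_2$.

```python
import random

random.seed(0)
n = 200_000

hits_a = hits_b = hits_both = 0
for _ in range(n):
    # four mutually independent Uniform(0,1) draws
    x1, x2, y1, y2 = (random.random() for _ in range(4))
    a = x1 + y1 <= 1.0   # event about X1 + Y1 (probability 1/2)
    b = x2 + y2 <= 1.0   # event about X2 + Y2 (probability 1/2)
    hits_a += a
    hits_b += b
    hits_both += a and b

p_a, p_b, p_ab = hits_a / n, hits_b / n, hits_both / n

# For independent sums, this difference should be close to 0
# up to Monte Carlo noise.
print(abs(p_ab - p_a * p_b))
```

This is only a consistency check, of course; the proof above is what actually establishes independence.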

SECOND ANSWER

An elementary proof of independence is a bit lengthy, and note that the condition in the question is not sufficient by itself: it only says the two sums are uncorrelated, which is strictly weaker than independence. But if you are familiar with characteristic functions, you can prove independence easily: you only have to show that $$Ee^{i[t(X_1+Y_1)+s(X_2+Y_2)]}=Ee^{it(X_1+Y_1)}\,Ee^{is(X_2+Y_2)}$$ for all real numbers $t$ and $s$. Can you show this?
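Spelling out the hint: by mutual independence of the four variables, the joint characteristic function factors into four one-dimensional factors, which can then be regrouped pairwise:

```latex
\begin{aligned}
E\,e^{i[t(X_1+Y_1)+s(X_2+Y_2)]}
  &= E\big[e^{itX_1}\,e^{itY_1}\,e^{isX_2}\,e^{isY_2}\big] \\
  &= E\,e^{itX_1}\; E\,e^{itY_1}\; E\,e^{isX_2}\; E\,e^{isY_2}
     && \text{(mutual independence)} \\
  &= \big(E\,e^{itX_1}\,E\,e^{itY_1}\big)\,\big(E\,e^{isX_2}\,E\,e^{isY_2}\big) \\
  &= E\,e^{it(X_1+Y_1)}\; E\,e^{is(X_2+Y_2)}
     && \text{(independence within each pair)}.
\end{aligned}
```

Since a joint characteristic function that factors in this way for all $t,s$ characterizes independence, this completes the argument.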