Independence of $X$ and $Y_1$ and independence of $X$ and $Y_2$ implies independence of $\sigma\{X\}$ and $\sigma\{Y_1,Y_2\}$?


$X,Y_1,Y_2$ are random variables. Suppose $X$ and $Y_1$ are independent, and $X$ and $Y_2$ are independent. Then by definition we have: $$Pr[X \leq x, Y_1\leq y_1] = Pr[X \leq x]Pr[Y_1\leq y_1],$$ $$Pr[X \leq x, Y_2\leq y_2] = Pr[X \leq x]Pr[Y_2\leq y_2].$$ I was wondering if the following equation holds: $$Pr[X \leq x, Y_1\leq y_1, Y_2\leq y_2] = Pr[X \leq x]Pr[Y_1\leq y_1, Y_2\leq y_2].$$

Actually, what I want to show is that if the $\sigma$-algebras $\sigma\{X\}$ and $\sigma\{Y_1\}$ are independent, and $\sigma\{X\}$ and $\sigma\{Y_2\}$ are independent, then $\sigma\{X\}$ and $\sigma\{Y_1,Y_2\}$ are independent (presumably via the $\pi$-$\lambda$ theorem).


This would be useful for calculating conditional expectations $\mathbf{E}[X|Y_1,\dots,Y_n]$ when $X$ and $Y_1,\dots,Y_n$ are jointly Gaussian.

Guess $Z := a + b_1Y_1+\dots+b_nY_n$ and solve $\mathbf{E}[X-Z]=0$ and $\mathbf{E}[(X-Z)Y_i]=0$, $i=1,\dots,n$, for $a$ and $b_1,\dots,b_n$. Then $X-Z$ is uncorrelated with each $Y_i$, and since they are jointly Gaussian, $X-Z$ and $Y_i$ are independent. Therefore(?) $\sigma\{X-Z\}$ and $\sigma\{Y_1,\dots,Y_n\}$ are independent, which would give $\mathbf{E}[X|Y_1,\dots,Y_n] = Z$.
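A numerical sketch of the projection step above, assuming NumPy; the covariance matrix here is made up for illustration, and the names `Sigma_YY`, `Sigma_YX` are just labels for its blocks:

```python
# Sketch of the Gaussian projection step: solve E[X - Z] = 0 and
# E[(X - Z) Y_i] = 0 for a and b, then check that the residual X - Z
# is empirically uncorrelated with each Y_i.  (Hypothetical example data.)
import numpy as np

rng = np.random.default_rng(0)

n = 3                                   # number of Y variables
A = rng.normal(size=(n + 1, n + 1))
cov = A @ A.T                           # an arbitrary positive-definite covariance
mean = rng.normal(size=n + 1)
samples = rng.multivariate_normal(mean, cov, size=200_000)
X, Y = samples[:, 0], samples[:, 1:]

# The two conditions reduce to the normal equations:
#   b = Cov(Y, Y)^{-1} Cov(Y, X),   a = E[X] - b . E[Y]
Sigma_YY = cov[1:, 1:]
Sigma_YX = cov[1:, 0]
b = np.linalg.solve(Sigma_YY, Sigma_YX)
a = mean[0] - b @ mean[1:]
Z = a + Y @ b

residual = X - Z
# Both quantities below should be approximately zero.
print(residual.mean())
print([np.cov(residual, Y[:, i])[0, 1] for i in range(n)])
```

The sample mean and sample covariances are only approximately zero because of Monte Carlo noise; the exact identities hold by construction of $a$ and $b$.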

Best answer:

Example: Toss two fair dice. Let $Y_i$ be the number showing on die $i$, $i=1,2$. Let $X$ be $1$ or $0$ depending on whether the sum of the two dice is even or odd. Then $Y_1$ is independent of $X$, $Y_2$ is independent of $X$, but $X$ is not independent of the pair $(Y_1,Y_2)$: indeed, $X$ is a deterministic, non-constant function of $(Y_1,Y_2)$, so it cannot be independent of the pair.
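The counterexample can be checked exhaustively over the 36 equally likely outcomes, in exact arithmetic; a small Python sketch:

```python
# Exhaustive check of the two-dice counterexample.
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))   # (y1, y2), each with prob 1/36
p = Fraction(1, 36)

def prob(event):
    return sum(p for o in outcomes if event(o))

# X = 1 if the sum of the dice is even, else 0.
X = lambda o: 1 if (o[0] + o[1]) % 2 == 0 else 0
pX = prob(lambda o: X(o) == 1)                    # = 1/2

# Pairwise independence: P(X=1, Y_i=y) == P(X=1) P(Y_i=y) for every face y.
for i in (0, 1):
    for y in range(1, 7):
        assert prob(lambda o: X(o) == 1 and o[i] == y) == pX * prob(lambda o: o[i] == y)

# But X is NOT independent of the pair (Y_1, Y_2): take the event {(1, 1)}.
lhs = prob(lambda o: X(o) == 1 and o == (1, 1))   # sum is 2, even, so this is 1/36
rhs = pX * prob(lambda o: o == (1, 1))            # 1/2 * 1/36 = 1/72
print(lhs, rhs, lhs == rhs)
```

The pairwise checks all pass, while the joint factorization fails on the single outcome $(1,1)$, matching the argument above.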

For the jointly Gaussian setting: What has to be true about the covariance matrix of $(X,Y_1,Y_2)$ for $X$ to be independent of $(Y_1,Y_2)$?
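A sketch of the standard fact this question points at, assuming a nonsingular covariance matrix: $X$ is independent of $(Y_1,Y_2)$ exactly when the cross-covariances vanish, because a block-diagonal covariance makes the joint density factor:

$$\operatorname{Cov}(X,Y_1)=\operatorname{Cov}(X,Y_2)=0 \;\iff\; \Sigma=\begin{pmatrix}\sigma_X^2 & 0\\ 0 & \Sigma_{YY}\end{pmatrix} \;\implies\; f_{X,Y_1,Y_2}(x,y_1,y_2)=f_X(x)\,f_{Y_1,Y_2}(y_1,y_2).$$

This is why the "Therefore(?)" step in the question does go through in the jointly Gaussian case, even though the implication fails in general: $(X-Z,Y_1,\dots,Y_n)$ is jointly Gaussian as a linear image of a Gaussian vector, and $\operatorname{Cov}(X-Z,Y_i)=0$ for all $i$ gives independence of $X-Z$ from the whole vector $(Y_1,\dots,Y_n)$.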