Let $(X_1, Y_1)$ and $(X_2, Y_2)$ be two pairs of random variables, and assume the two pairs are independent of each other.
Does it mean that:
1. $X_1$ is independent of $X_2$?
2. $X_1$ is independent of $Y_2$?
3. $Y_1$ is independent of $X_1$?
4. $Y_1$ is independent of $Y_2$?
Motivation: I had this question because in many statistical applications you want to find a parameter $\theta$ that maximizes the conditional joint distribution
$$P_{Y_1, Y_2}(y_1, y_2 \mid x_1, x_2; \theta).$$ There is often the assumption that $(X_1, Y_1)$ and $(X_2, Y_2)$ are independent and identically distributed.
Using this assumption, $$P_{Y_1, Y_2}(y_1, y_2 \mid x_1, x_2; \theta) = P_{Y_1}(y_1 \mid x_1; \theta)\,P_{Y_2}(y_2 \mid x_2; \theta) = \prod_{i = 1}^{2} P_{Y_i}(y_i \mid x_i; \theta)$$
However, for this calculation to work, you must be able to show,
$$P_{Y_1, Y_2}(y_1, y_2| x_1, x_2; \theta) = \dfrac{\Pr(Y_1 = y_1, Y_2 = y_2, X_1 = x_1, X_2 = x_2; \theta)}{\Pr(X_1 = x_1, X_2 = x_2; \theta)}$$
which means you must split the denominator as $\Pr(X_1 = x_1, X_2 = x_2; \theta) = \Pr(X_1 = x_1; \theta)\Pr(X_2 = x_2; \theta)$. However, the independence of $X_1$ and $X_2$ is not explicitly stated. So is it implicit in the assumption, or is an assumption missing?
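A quick numerical sanity check of this factorization (a sketch, assuming a discrete toy model: one arbitrary pmf for a pair $(X, Y)$ on a $2 \times 3$ support, with the two pairs taken independent and identically distributed):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary pmf for a single pair (X, Y) on a 2x3 support.
p_pair = rng.random((2, 3))
p_pair /= p_pair.sum()

# Independence of the pairs: the joint pmf of (X1, Y1, X2, Y2)
# is the product of the two pair pmfs.
p_joint = np.einsum('ab,cd->abcd', p_pair, p_pair)

# Marginal of (X1, X2): sum out Y1 and Y2.
p_x1x2 = p_joint.sum(axis=(1, 3))

# Marginals of X1 and X2 individually (identical here, since i.i.d.).
p_x1 = p_pair.sum(axis=1)
p_x2 = p_pair.sum(axis=1)

# The denominator factorizes: P(X1=x1, X2=x2) = P(X1=x1) P(X2=x2).
print(np.allclose(p_x1x2, np.outer(p_x1, p_x2)))
```

This only checks one toy joint law, of course; it does not replace a proof.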
YES, YES, NO, and YES. If two sigma-algebras are independent, then any sub-sigma-algebras of them are also independent. Since $\sigma(X_1) \subseteq \sigma(X_1, Y_1)$ and $\sigma(X_2), \sigma(Y_2) \subseteq \sigma(X_2, Y_2)$, this proves 1), 2), and 4).
For 3), take two independent (non-constant) random variables $U$ and $V$, and set $X_1 = Y_1 = U$, $X_2 = Y_2 = V$. Then $(X_1, Y_1)$ and $(X_2, Y_2)$ are independent, but $Y_1 = X_1$ is certainly not independent of $X_1$.
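The counterexample can be illustrated numerically (a sketch, taking $U$ and $V$ to be independent fair coins and estimating probabilities from samples):

```python
import numpy as np

rng = np.random.default_rng(1)

# U, V independent fair coins; X1 = Y1 = U and X2 = Y2 = V.
u = rng.integers(0, 2, size=100_000)
v = rng.integers(0, 2, size=100_000)
x1, y1 = u, u   # the pair (X1, Y1) is degenerate: X1 == Y1
x2, y2 = v, v

# Y1 is not independent of X1: the joint probability does not factor.
p_joint_11 = np.mean((x1 == 0) & (y1 == 0))          # ≈ 0.5
p_prod_11 = np.mean(x1 == 0) * np.mean(y1 == 0)      # ≈ 0.25
print(p_joint_11, p_prod_11)

# By contrast, Y1 and Y2 (claim 4) do factor, up to sampling noise.
p_joint_12 = np.mean((y1 == 0) & (y2 == 0))
p_prod_12 = np.mean(y1 == 0) * np.mean(y2 == 0)
print(abs(p_joint_12 - p_prod_12))
```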