Suppose $X_1$ and $Y_1$ have identical distributions, and $X_2$ and $Y_2$ have identical distributions. What about $(X_1,X_2)$ and $(Y_1,Y_2)$?



I could not prove that $(X_1,X_2)$ and $(Y_1,Y_2)$ have identical distribution, so I think the statement might be false.

Any hints towards the right direction?

I tried to prove it using the definition. Can we strengthen the hypotheses so that the statement becomes true, for example by adding independence?

EDIT: If you add independence between $X_1$ and $X_2$, and between $Y_1$ and $Y_2$, the statement becomes true.
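Indeed, a one-line sketch of why independence suffices: if $X_1 \perp X_2$ and $Y_1 \perp Y_2$, the joint distribution functions factorize into the marginals, so

```latex
F_{(X_1,X_2)}(x_1,x_2)
  = F_{X_1}(x_1)\,F_{X_2}(x_2)
  = F_{Y_1}(x_1)\,F_{Y_2}(x_2)
  = F_{(Y_1,Y_2)}(x_1,x_2),
```

where the middle equality uses $F_{X_i}=F_{Y_i}$ for $i=1,2$.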


Accepted answer:

$\def\deq{\stackrel{\mathrm{d}}{=}}$Suppose $X \sim U(0, 1)$ and let $X_1 = X_2 = Y_1 = X$ and $Y_2 = 1 - X$. Then $$ X_1 \deq Y_1, \quad X_2 \deq Y_2, $$ but $$ (X_1, X_2) \not\deq (Y_1, Y_2). $$
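A quick Monte Carlo sanity check of this counterexample (a sketch in Python; under the first vector the event $\{X_1\le\tfrac12,\,X_2\le\tfrac12\}$ has probability $\tfrac12$, while under the second it has probability $0$):

```python
import random

random.seed(0)
n = 100_000

# Draw X ~ U(0, 1); set X1 = X2 = Y1 = X and Y2 = 1 - X.
hits_x = hits_y = 0
for _ in range(n):
    x = random.random()
    if x <= 0.5 and x <= 0.5:        # (X1, X2) = (X, X)
        hits_x += 1
    if x <= 0.5 and 1 - x <= 0.5:    # (Y1, Y2) = (X, 1 - X)
        hits_y += 1

p_x = hits_x / n  # estimates P(X1 <= 1/2, X2 <= 1/2) = 1/2
p_y = hits_y / n  # estimates P(Y1 <= 1/2, Y2 <= 1/2) = 0
print(p_x, p_y)
```

The two joint probabilities of the same rectangle disagree, even though every marginal is the same $U(0,1)$ law.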

Answer:

Suppose that $X_1$ takes the values $0$ and $1$ each with probability $\frac12$, and let $X_2=X_1$, $Y_1=X_1$, $Y_2=1-X_1$, so all four variables have Bernoulli distributions with parameter $\frac12$.

Then $(X_1,X_2)=(0,0)$ or $(1,1)$, each with probability $\frac12$,

but $(Y_1,Y_2)=(0,1)$ or $(1,0)$, each with probability $\frac12$.
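Since everything here is discrete, the two joint laws can be tabulated exactly; a small Python sketch of the enumeration:

```python
from collections import Counter
from fractions import Fraction

half = Fraction(1, 2)
joint_x = Counter()  # law of (X1, X2)
joint_y = Counter()  # law of (Y1, Y2)

# X1 is 0 or 1 with probability 1/2 each; X2 = X1, Y1 = X1, Y2 = 1 - X1.
for x1 in (0, 1):
    joint_x[(x1, x1)] += half      # (X1, X2) = (X1, X1)
    joint_y[(x1, 1 - x1)] += half  # (Y1, Y2) = (X1, 1 - X1)

# Identical Bernoulli(1/2) marginals, but the joint laws have disjoint supports.
print(dict(joint_x))  # {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}
print(dict(joint_y))  # {(0, 1): Fraction(1, 2), (1, 0): Fraction(1, 2)}
```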

Answer:

Keeping in mind the examples given in the other answers, the intuition is simple: knowing the marginal distributions of two random variables is not enough to determine their joint distribution unless they are independent.

In general, given two non-degenerate univariate distributions (say, with distribution functions $F_1$ and $F_2$), there are several (infinitely many, in general) different bivariate distributions—say with joint distribution function $F$—such that their marginal distributions are $F_1$ and $F_2$, that is $$\lim_{y\to+\infty}F(x,y)=F_1(x)\quad\text{and}\quad\lim_{x\to+\infty}F(x,y)=F_2(y).$$

Also, note that $F(x,y)=F_1(x)\cdot F_2(y)$ is always one such $F$; it corresponds to the case in which the two variables are independent, but it does not have to be the only case. Moreover, there are many ways in which two variables can be related, and each implies a specific joint distribution.

In particular, note that adding independence as a hypothesis does imply that both joint distributions are equal. This is also the case in some other very specific situations. For instance:

Let $(X_1,X_2)$ and $(Y_1,Y_2)$ be random vectors with bivariate normal distribution, and suppose that $X_1,Y_1\sim N(\mu_1,\sigma^2_1)$ and $X_2,Y_2\sim N(\mu_2,\sigma^2_2)$. Then, if $\text{cov}(X_1,X_2)=\text{cov}(Y_1,Y_2)$, it is the case that $(X_1,X_2)$ and $(Y_1,Y_2)$ have the same joint distribution.

(In any case, the previous hypotheses are strong, and the conclusion follows from the fact that the bivariate normal distribution has only five parameters: the two means, the two variances, and the covariance, or equivalently the correlation coefficient.)