Conditions on a bivariate distribution to be the distribution of $(X_1-X_0, X_1-X_2)$, $(X_2-X_0, X_2-X_1)$ and $(X_0-X_1, X_0-X_2)$


Consider a bivariate probability distribution $P$ on $\mathbb{R}^2$. I have the following question:

Are there necessary and sufficient conditions on the cumulative distribution function (CDF) associated with $P$ (joint or marginal) ensuring that there exists a random vector $(X_0,X_1,X_2)$ such that $$ (X_1-X_0,\, X_1-X_2),\quad (X_2-X_0,\, X_2-X_1),\quad (X_0-X_1,\, X_0-X_2) $$ all have probability distribution $P$?


Notice:

The condition $(X_1-X_0, X_1-X_2)\sim (X_2-X_0, X_2-X_1)\sim (X_0-X_1, X_0-X_2)$ does not force any of the random variables $X_0, X_1, X_2$ to be degenerate. For example, the condition holds whenever $(X_0, X_1, X_2)$ is exchangeable.
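As a quick sanity check of the exchangeable case, here is a Monte Carlo sketch (assuming NumPy; the i.i.d. normal choice and the 0.01 tolerance are arbitrary): the empirical marginal distributions of the three difference vectors agree up to sampling noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# An exchangeable example: X0, X1, X2 i.i.d. (hence exchangeable).
X0, X1, X2 = rng.standard_normal((3, n))

# The three difference vectors from the question.
A = np.stack([X1 - X0, X1 - X2], axis=1)
B = np.stack([X2 - X0, X2 - X1], axis=1)
C = np.stack([X0 - X1, X0 - X2], axis=1)

def ks_1d(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: max gap between empirical CDFs."""
    grid = np.sort(np.concatenate([x, y]))
    Fx = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    Fy = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    return np.max(np.abs(Fx - Fy))

# Each marginal coordinate of A, B, C should have (approximately) the same law.
for k in range(2):
    assert ks_1d(A[:, k], B[:, k]) < 0.01
    assert ks_1d(A[:, k], C[:, k]) < 0.01
```

This only compares marginals, not the joint laws, but it illustrates that no degeneracy is needed.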


My thoughts: among the necessary conditions, I would list the following. Let $G$ be the CDF associated with $P$ and let $G_1,G_2$ be the two marginal CDFs. Since $(X_1-X_0, X_1-X_2)\sim (X_0-X_1, X_0-X_2)$ forces $X_1-X_0\sim -(X_1-X_0)$, and similarly for the second coordinate, it should be that $$ \begin{cases} G_1 \text{ is symmetric around zero, i.e., $G_1(a)=1-G_1(-a)$ $\forall a \in \mathbb{R}$,}\\ G_2 \text{ is symmetric around zero, i.e., $G_2(a)=1-G_2(-a)$ $\forall a \in \mathbb{R}$.}\\ \end{cases} $$
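The symmetry of $G_1$ can be illustrated numerically: even for an exchangeable construction built from an asymmetric base law, the first difference coordinate is symmetric about zero. A sketch assuming NumPy (the exponential base law, grid points, and 0.01 tolerance are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
# Exchangeable via i.i.d.; exponential is deliberately asymmetric.
X0, X1, X2 = rng.exponential(size=(3, n))

U = X1 - X0  # first coordinate of (X1-X0, X1-X2); its CDF is G1

# Check G1(a) ≈ 1 - G1(-a) on a small grid of points a.
for a in (-2.0, -1.0, -0.5, 0.5, 1.0, 2.0):
    G1_a = np.mean(U <= a)
    assert abs(G1_a - (1 - np.mean(U <= -a))) < 0.01
```

(Here $U$ is a difference of i.i.d. exponentials, i.e. a Laplace variable, which is indeed symmetric.)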

Are these conditions also sufficient? If not, what else should be added to get an exhaustive set of sufficient and necessary conditions?

1 Answer

When you have a vector of random variables, or equivalently a random variable taking values in $\mathbb R^2$, you can write it as $(U,V)$, where $U$ is the $x$-coordinate and $V$ is the $y$-coordinate of the random vector. So $$G_1(u)=\mathbb P(U\le u),$$ $$G_2(v)=\mathbb P(V\le v).$$

Now, in general if $X$ and $Y$ are random variables and $F_X(x)=\mathbb P(X\le x)$, $F_Y(y)=\mathbb P(Y\le y)$, then we write $X\sim Y$ if $F_X=F_Y$.


Besides the conditions you give,

namely: if $(U,V)$ is a random vector on $\mathbb R^2$ as desired, then $U\sim -U$ and $V\sim -V$, where $\sim$ denotes "has the same distribution as",

there's also

the condition $V-U\sim V$: identifying $(U,V)$ with $(X_0-X_1, X_0-X_2)$, $$V-U = (X_0-X_2)-(X_0-X_1) = X_1-X_2\sim V,$$ since $X_1-X_2$ is the second coordinate of $(X_1-X_0, X_1-X_2)$, which also has distribution $P$.

And note that $V\sim -V$, $U\sim -U$ does not imply $V-U\sim V$; e.g., take $U$, $V$ to be independent standard normal $N(0,1)$ random variables: $$\mathrm{Var}(V-U)=\mathrm{Var}(V)+\mathrm{Var}(U) = 2>1=\mathrm{Var}(V).$$
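This counterexample can be checked directly by simulation (assuming NumPy; sample size and tolerances are arbitrary): both marginals are symmetric, yet $V-U$ has variance $2$ while $V$ has variance $1$, so $V-U\not\sim V$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
U = rng.standard_normal(n)
V = rng.standard_normal(n)  # independent of U

# U ~ -U and V ~ -V hold (both are N(0,1)), but V - U is N(0,2):
assert abs(np.var(V - U) - 2.0) < 0.02  # Var(V-U) = 2
assert abs(np.var(V) - 1.0) < 0.02      # Var(V)   = 1
```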