Does knowing the distribution of $(X,Y)$ imply knowing the distribution of $(X,Y,X-Y)$?


Consider a random vector $(X,Y)$ and suppose we know the joint probability distribution $P$ of $(X,Y)$.

Does this mean that we know (even if it may be hard to derive analytically in many cases) the joint probability distribution of $(X,Y,X-Y)$? I understand that the marginals are all known, but I have doubts about the joint distribution.

Can you provide an example to corroborate your answer, e.g., $$ (X,Y)\sim \mathcal{N}((0,0), \begin{pmatrix} \sigma^2_1 & \sigma_{12}\\ \sigma_{12} & \sigma^2_2\\ \end{pmatrix}) $$


BEST ANSWER

If $f$ is the (continuous) mapping $(x,y)\to(x,y,x-y)$ from $\Bbb R^2$ to $\Bbb R^3$, then the joint distribution of $(X,Y,X-Y)$ is the probability measure $Q$ given by $Q(B)=P(f^{-1}(B))$, for Borel sets $B\subset\Bbb R^3$. This provides a rather abstract YES answer to your question.

In your normal example, because $f$ is linear, the vector $(X,Y,X-Y)$ has a (degenerate) joint normal distribution, with zero means and covariance matrix $$ \begin{pmatrix} \sigma_1^2 & \sigma_{12} & \sigma_1^2-\sigma_{12}\\ \sigma_{12} & \sigma_2^2 & \sigma_{12}-\sigma_2^2\\ \sigma_1^2-\sigma_{12} & \sigma_{12}-\sigma_2^2 & \sigma_1^2+\sigma_2^2-2\sigma_{12} \end{pmatrix}. $$ (Degenerate because this covariance matrix has rank at most 2, since the third row is the difference of the first two.)
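This is easy to check numerically: sample $(X,Y)$ from a bivariate normal, append $X-Y$, and compare the empirical covariance of the triple against the matrix above. The parameter values below ($\sigma_1^2=2$, $\sigma_2^2=1$, $\sigma_{12}=0.5$) are hypothetical choices for illustration.

```python
import numpy as np

# Hypothetical parameters for the bivariate normal
sigma1_sq, sigma2_sq, sigma12 = 2.0, 1.0, 0.5

rng = np.random.default_rng(0)
cov_xy = np.array([[sigma1_sq, sigma12],
                   [sigma12, sigma2_sq]])
xy = rng.multivariate_normal([0.0, 0.0], cov_xy, size=200_000)

# Append Z = X - Y as a third coordinate
xyz = np.column_stack([xy, xy[:, 0] - xy[:, 1]])

# Theoretical covariance of (X, Y, X - Y) from the answer above
cov_theory = np.array([
    [sigma1_sq, sigma12,   sigma1_sq - sigma12],
    [sigma12,   sigma2_sq, sigma12 - sigma2_sq],
    [sigma1_sq - sigma12, sigma12 - sigma2_sq,
     sigma1_sq + sigma2_sq - 2 * sigma12],
])

cov_emp = np.cov(xyz, rowvar=False)
print(np.max(np.abs(cov_emp - cov_theory)))  # small sampling error
print(np.linalg.matrix_rank(cov_theory))     # 2: the distribution is degenerate
```

The rank-2 covariance confirms the degeneracy: all the mass of $(X,Y,X-Y)$ lives on the plane $z=x-y$ in $\Bbb R^3$.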

ANSWER

Yes. Let $Z=X-Y$. Then, in the discrete case,

$p(X=x, Y=y, Z=z) = \begin{cases} p(X=x, Y=y), \text{ if } z=x-y \\ 0, \text{ otherwise}\end{cases}$