Uncorrelated Random Variables


If $X_1, X_2$ are $2$ random variables such that $(X_1, X_2)$ and $(-X_1, X_2)$ have the same joint distributions then show that $X_1$ and $X_2$ are uncorrelated.

I know that $X_1$ and $X_2$ are uncorrelated if and only if $\operatorname{Cov}(X_1, X_2) = E(X_1X_2) - E(X_1)E(X_2) = 0$, i.e. $E(X_1X_2) = E(X_1)E(X_2)$.

But how do I proceed from here?


There are 2 solutions below.

Solution 1:

Hint: Use the symmetry to show that $E(X_1 X_2) = -E(X_1 X_2)$, hence $E(X_1 X_2) = 0$, and similarly that $E(X_1) = -E(X_1) = 0$.
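The hint can be sanity-checked numerically. Below is a small Monte Carlo sketch using an assumed concrete example: $X_1 \sim N(0,1)$ and $X_2 = X_1^2$. Since $X_2$ is an even function of $X_1$, the pairs $(X_1, X_2)$ and $(-X_1, X_2)$ have the same joint distribution, so the hint predicts $E(X_1) \approx 0$ and $E(X_1 X_2) \approx 0$.

```python
# Monte Carlo check of the hint on an assumed example:
# X1 ~ N(0,1), X2 = X1^2, so (X1, X2) and (-X1, X2) share a joint distribution.
import random

random.seed(0)
n = 200_000
x1 = [random.gauss(0.0, 1.0) for _ in range(n)]
x2 = [v * v for v in x1]

mean_x1 = sum(x1) / n
mean_x2 = sum(x2) / n
mean_x1x2 = sum(a * b for a, b in zip(x1, x2)) / n
cov = mean_x1x2 - mean_x1 * mean_x2

print(f"E[X1]      = {mean_x1:+.4f}")   # close to 0
print(f"E[X1*X2]   = {mean_x1x2:+.4f}") # close to 0
print(f"Cov(X1,X2) = {cov:+.4f}")       # close to 0
```

Note that in this example $X_1$ and $X_2$ are clearly dependent, so it also illustrates that uncorrelated does not imply independent.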

Solution 2:

If $(X_1,X_2)$ and $(-X_1,X_2)$ have the same joint distribution, then for any measurable $f:\mathbb{R}^2\to\mathbb{R}$, the random variables $f(X_1,X_2)$ and $f(-X_1,X_2)$ have the same distribution as well.

This gives you a few nice pieces of information:

  1. $X_1$ and $-X_1$ have the same distribution.
  2. $X_1X_2$ and $-X_1X_2$ have the same distribution.

Now, if two random variables have the same distribution, then they must have the same expectation. (Why?) Can you see where to go from here?
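Spelling out the remaining step suggested above (assuming the relevant expectations are finite, so that equality in distribution gives equality of expectations):

```latex
\begin{align*}
X_1 \stackrel{d}{=} -X_1
  &\implies E(X_1) = E(-X_1) = -E(X_1) \implies E(X_1) = 0,\\
X_1 X_2 \stackrel{d}{=} -X_1 X_2
  &\implies E(X_1 X_2) = -E(X_1 X_2) \implies E(X_1 X_2) = 0,\\
\operatorname{Cov}(X_1, X_2)
  &= E(X_1 X_2) - E(X_1)\,E(X_2) = 0 - 0 \cdot E(X_2) = 0.
\end{align*}
```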