Uncorrelated but not independent uniform distribution


Let $X = (X_1, X_2)$ be uniformly distributed on $\{(-1,0), (1,0), (0,-1), (0,1)\}$.

First of all I want to show that $X_1$ and $X_2$ are uncorrelated but not independent.

Secondly I thought about showing that $X_1 - X_2, X_1 + X_2$ are independent (and thus uncorrelated).

I have thought about it for quite some time, but I can't figure out a solution. Help is appreciated!


Best answer:

The pmf (probability mass function) of $X=(X_1,\,X_2)$ is $$ f(x_1,x_2)=\begin{cases} \frac{1}{4} & \text{for } (x_1,x_2)\in Q=\{(-1,0), (1,0), (0,-1), (0,1)\}\\ 0 & \text{otherwise} \end{cases} $$ and the variable $X=(X_1,X_2)$ can be represented in tabular form $$ \begin{pmatrix} (X_1,X_2)\\ f(x_1,x_2) \end{pmatrix}= \begin{pmatrix} (-1,0) & (1,0) & (0,-1) & (0,1)\\ \frac{1}{4} & \frac{1}{4} & \frac{1}{4} &\frac{1}{4} \end{pmatrix} $$

$X_1$ and $X_2$ are uncorrelated when their correlation coefficient $$ \rho(X_1,X_2)=\frac{\mathsf{Cov}(X_1,X_2)}{\sqrt{\mathsf{Var}(X_1)\mathsf{Var}(X_2)}} $$ is zero, that is, when their covariance is zero: $$\mathsf{Cov}(X_1,X_2)=\mathbb E(X_1X_2) - \mathbb E(X_1)\mathbb E(X_2)=0$$

So you have $$ \mathbb E(X_1X_2)=\sum_{(x_1,x_2)\in Q} x_1x_2 f(x_1,x_2)=\frac{1}{4}\sum_{(x_1,x_2)\in Q} x_1 x_2=0 $$ (every support point has a zero coordinate, so every term vanishes), and, summing over the whole support, $$ \mathbb E(X_1)=\sum_{(x_1,x_2)\in Q} x_1 f(x_1,x_2)=\frac{1}{4}(-1+1+0+0)=0 $$ and likewise $$ \mathbb E(X_2)=\sum_{(x_1,x_2)\in Q} x_2 f(x_1,x_2)=\frac{1}{4}(0+0-1+1)=0. $$ So we have $$ \mathsf{Cov}(X_1,X_2)=0 $$ and the variables are uncorrelated.
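As a sanity check, these expectations can be reproduced by enumerating the four support points. A minimal Python sketch (the names `Q`, `p`, `E_X1X2`, etc. are ours, not from the answer):

```python
# The four equally likely support points of X = (X1, X2).
Q = [(-1, 0), (1, 0), (0, -1), (0, 1)]
p = 1 / len(Q)  # each point carries probability 1/4

# E[X1*X2]: every support point has a zero coordinate, so each term is 0.
E_X1X2 = sum(p * x1 * x2 for x1, x2 in Q)

# E[X1] and E[X2]: the -1 and +1 values cancel.
E_X1 = sum(p * x1 for x1, x2 in Q)
E_X2 = sum(p * x2 for x1, x2 in Q)

cov = E_X1X2 - E_X1 * E_X2
print(E_X1X2, E_X1, E_X2, cov)  # → 0.0 0.0 0.0 0.0
```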

They are not independent: $$ f_{X_1}(x_1)=\sum_{x_2\in Q_2} f(x_1,x_2)=\pmatrix{-1 & 0 & 1\\ \frac{1}{4} & \frac{1}{2} & \frac{1}{4}} $$ and $$ f_{X_2}(x_2)=\sum_{x_1\in Q_1} f(x_1,x_2)=\pmatrix{-1 & 0 & 1\\ \frac{1}{4} & \frac{1}{2} & \frac{1}{4}} $$ and $f(x_1,\,x_2)\neq f_{X_1}(x_1)f_{X_2}(x_2)$ (for example $f(1,1)=0$ while $f_{X_1}(1)f_{X_2}(1)=\frac{1}{16}$), as you can see from the tabular representation $$ \begin{array}{cc|ccc|cc} &&&X_2 &&&\\ &f(x_1,x_2) & -1 & 0 & 1 & f_{X_1}(x_1)\\ \hline &-1 & 0 & \frac{1}{4} & 0 & \frac{1}{4}\\ X_1&0 & \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{2}\\ &1 & 0 & \frac{1}{4} & 0 & \frac{1}{4}\\ \hline &f_{X_2}(x_2)&\frac{1}{4} & \frac{1}{2} & \frac{1}{4} & \end{array} $$ It's easy to find that for $Y=X_1-X_2$ we have $$f_Y(y)=\sum_{x_1,x_2|x_1-x_2=y} f(x_1,x_2)=\sum_{x_1\in\{-1,0,1\}} f(x_1,x_1-y)$$ that is $$ \begin{pmatrix} Y\\ f_Y(y) \end{pmatrix}= \begin{pmatrix} -1 & 1\\ \frac{1}{2} & \frac{1}{2} \end{pmatrix} $$ In the same way, for $Z=X_1+X_2$ we have $$f_Z(z)=\sum_{x_1,x_2|x_1+x_2=z} f(x_1,x_2)=\sum_{x_1\in\{-1,0,1\}} f(x_1,z-x_1)$$ that is $$ \begin{pmatrix} Z\\ f_Z(z) \end{pmatrix}= \begin{pmatrix} -1 & 1\\ \frac{1}{2} & \frac{1}{2} \end{pmatrix} $$ For the joint distribution of $(Y,\,Z)$ we have $$ f_{Y,Z}(y,z)=\Bbb P(Y=y,\,Z=z)=\Bbb P(X_1-X_2=y,\,X_1+X_2=z) $$ for $(y,z)\in\{(-1,-1), (-1,1),(1,-1),(1,1)\}$. So for example $$f_{Y,Z}(-1,-1)=\Bbb P(X_1=-1,\,X_2=0)=f(-1,0)=\frac{1}{4}$$ and so on. Thus we have $$ \begin{pmatrix} (Y,Z)\\ f_{Y,Z}(y,z) \end{pmatrix}= \begin{pmatrix} (-1,-1) & (-1,1) & (1,-1) & (1,1)\\ \frac{1}{4} & \frac{1}{4} & \frac{1}{4} &\frac{1}{4} \end{pmatrix} $$ that is, $(Y,Z)$ is uniformly distributed over $Q'=\{(-1,-1), (-1,1),(1,-1),(1,1)\}$ and it is immediate that $$ f_{Y,Z}(y,z)=f_{Y}(y)f_{Z}(z) $$ that is, $Y=X_1-X_2$ and $Z=X_1+X_2$ are independent (and, as a consequence, uncorrelated).
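All of the claims above (the marginals, the failure of independence for $(X_1,X_2)$, and the independence of $Y$ and $Z$) can be verified by enumerating the four support points. A minimal Python sketch using exact rational arithmetic (all variable names are ours):

```python
from collections import Counter
from fractions import Fraction

# Support of X = (X1, X2); each point has probability 1/4.
Q = [(-1, 0), (1, 0), (0, -1), (0, 1)]
p = Fraction(1, 4)  # exact arithmetic avoids float noise

# Marginals of X1 and X2.
fX1, fX2 = Counter(), Counter()
for x1, x2 in Q:
    fX1[x1] += p
    fX2[x2] += p

# Not independent: the joint puts mass 0 on (1, 1),
# but the product of marginals gives 1/16 there.
assert Fraction(0) != fX1[1] * fX2[1]

# Joint and marginals of Y = X1 - X2 and Z = X1 + X2.
fYZ, fY, fZ = Counter(), Counter(), Counter()
for x1, x2 in Q:
    y, z = x1 - x2, x1 + x2
    fYZ[(y, z)] += p
    fY[y] += p
    fZ[z] += p

# Independent: the joint factors at every (y, z) pair.
assert all(fYZ[(y, z)] == fY[y] * fZ[z] for y in fY for z in fZ)
print(dict(fYZ))  # each of the four corners gets probability 1/4
```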

Another answer:

$X_1 X_2=0$ with probability one (one coordinate is always zero), so $E[X_1 X_2]=0$; thus, since $E[X_1]=E[X_2]=0$, the covariance and the correlation must be zero.

But, for example, $X_1=1$ forces $X_2=0$, so $\Bbb P(X_2=1\mid X_1=1)=0\neq \frac{1}{4}=\Bbb P(X_2=1)$, and therefore they are not independent.