Joint pdf such that $X$ and $Y$ are uncorrelated but dependent


We have two uniform random variables, i.e., $X \sim U[0, A]$ and $Y \sim U[0, A]$. I would like to construct a joint pdf such that $X$ and $Y$ are uncorrelated but dependent. I think the joint pdf should involve some kind of symmetry that results in zero correlation. I tried trial and error but didn't find a solution. How should I approach the problem?


BEST ANSWER

Let $A=1$, since $A$ is only a scale parameter. A simple trick is to perturb the joint density of two independent variables by adding a product $g(x)h(y)$, where both functions have zero integral over $(0,\,1)$, so that the product vanishes when integrating over either variable. One also needs the integral of this product against $(x-0.5)(y-0.5)$ to vanish, so that the covariance remains zero.

For example, one can consider $$ f_{X,Y}(x,y) = \biggl(1+\cos\bigl(4\pi(x-0.5)\bigr)\cos\bigl(4\pi(y-0.5)\bigr)\biggr)1_{\{0< x < 1,\ 0 < y < 1\}}. $$ This is a joint pdf since $f_{X,Y}(x,y)\geq 0$ and $$ \int_0^1\int_0^1 \cos\bigl(4\pi(x-0.5)\bigr)\cos\bigl(4\pi(y-0.5)\bigr)\,dx\,dy =0, $$ and the marginal pdfs are still uniform since both iterated integrals are zero as well. Moreover, $$ \mathbb E\left[X-0.5\right]=\int_0^1 (x-0.5)\,dx+ \int_0^1 (x-0.5)\cos\bigl(4\pi(x-0.5)\bigr)\,dx\int_0^1 \cos\bigl(4\pi(y-0.5)\bigr)\,dy=0=\mathbb E\left[Y-0.5\right], $$ so $\mathbb E[X]=0.5$ and $\mathbb E[Y]=0.5$. The covariance is zero too: \begin{multline} \mathrm{cov}(X,Y)=\mathbb E\left[(X-0.5)(Y-0.5)\right] \cr =\int_0^1 (x-0.5)\,dx\int_0^1 (y-0.5)\,dy+\int_0^1\int_0^1 (x-0.5)(y-0.5)\cos\bigl(4\pi(x-0.5)\bigr)\cos\bigl(4\pi(y-0.5)\bigr)\,dx\,dy =0, \end{multline} but $X$ and $Y$ are dependent, since $f_{X,Y}$ does not factor into the product of its (uniform) marginals.
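As a quick numerical sanity check (not part of the argument above), a midpoint-rule integration of this density confirms that it integrates to one, has uniform marginals, and has zero covariance:

```python
import numpy as np

# Midpoint-rule check of f(x, y) = 1 + cos(4*pi*(x - 0.5)) * cos(4*pi*(y - 0.5))
# on the unit square, using an n-by-n grid of cell midpoints.
n = 400
pts = (np.arange(n) + 0.5) / n                   # cell midpoints in (0, 1)
X, Y = np.meshgrid(pts, pts, indexing="ij")
f = 1 + np.cos(4 * np.pi * (X - 0.5)) * np.cos(4 * np.pi * (Y - 0.5))
dA = 1.0 / n**2                                  # area of one grid cell

total = np.sum(f) * dA                           # integral of f: should be 1
marg_x = np.sum(f, axis=1) / n                   # marginal density of X: should be 1 everywhere
cov = np.sum((X - 0.5) * (Y - 0.5) * f) * dA     # cov(X, Y): should be 0

print(total, np.max(np.abs(marg_x - 1)), cov)
```

The cosine terms are symmetric about $x=0.5$, so the midpoint sums reproduce the exact zero integrals up to floating-point error.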

You can also find a simpler example by considering piecewise-constant functions $g$ and $h$. Say, the functions $$ g(x)=\begin{cases}-1, & x\in (0,\ 0.25)\cup (0.75,\ 1)\cr 1, & x\in [0.25,\ 0.75]\cr 0, & x\not \in(0,\ 1)\end{cases} \text{ and } h(y)=g(y) $$ satisfy all the above properties.

The joint pdf is then $$ f_{X,Y}(x,y)=1_{\{0< x < 1,\ 0 < y < 1\}}+g(x)h(y). $$ Its values are shown in the picture:

(Figure: joint pdf values — the density equals $2$ where $g(x)h(y)=1$ and $0$ where $g(x)h(y)=-1$)

The dependence of $X$ and $Y$ is obvious, since there are regions of zero joint probability.


The two pictures below are taken from the Wikipedia page on dependent but uncorrelated normal random variables. Although they concern a different situation (normal rather than uniform distributions), each of them suggests a simple solution to your problem.

First solution:
(first figure from the Wikipedia page)
Thus, consider $X$ uniform on $(0,A)$, $Z$ Bernoulli independent of $X$ with $P(Z=1)=\frac12$, $P(Z=0)=\frac12$, and $$Y=ZX+(1-Z)(A-X).$$ In words, $Y$ is either $X$ or $A-X$, each with probability $\frac12$, independently of the value of $X$.

Then $Y$ is uniform on $(0,A)$, the pair $(X,Y)$ is dependent (indeed $|Y-A/2|=|X-A/2|$ almost surely), and $$E(XY)=\tfrac12 E(X^2)+\tfrac12 E\bigl(X(A-X)\bigr)=\tfrac{A}{2}E(X)=\tfrac{A^2}{4}=E(X)E(Y),$$ hence $(X,Y)$ is uncorrelated.
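A brief simulation (a sketch, taking $A=1$ for concreteness) illustrates this construction:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
A = 1.0

x = rng.uniform(0, A, n)
z = rng.integers(0, 2, n)            # fair coin flip, independent of X
y = z * x + (1 - z) * (A - x)        # Y = X or A - X, each with probability 1/2

corr = np.corrcoef(x, y)[0, 1]
print(abs(corr))                     # close to 0: uncorrelated
# Dependence: knowing X pins Y down to two values, since |Y - A/2| = |X - A/2|.
print(np.allclose(np.abs(y - A / 2), np.abs(x - A / 2)))
```

The sample correlation vanishes up to Monte Carlo error, while the deterministic relation $|Y-A/2|=|X-A/2|$ holds exactly in every sample.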

Second solution:
(second figure from the Wikipedia page)
Your turn: can you find some formulas describing $X$ and $Y$ uniform on $(0,1)$ corresponding to this second picture, and, even more to the point, can you show that this distribution of $(X,Y)$ solves your question?