Finding the joint distribution of two dependent variables


Let $X$ and $Z$ be independent random variables with $X$ uniformly distributed on $(-1, 1)$ and $Z$ uniformly distributed on $(0, 0.1)$. Let $Y = X^2 + Z$. Then $X$ and $Y$ are dependent.

Find the joint pdf of $X$ and $Y$.

The correct solution is $f_{X,Y}(x,y) = 5$ for $-1<x<1$ and $x^2<y<x^2+0.1$, and $0$ otherwise.

I notice that this is the same as the joint density of $X$ and $Z$ (which factors, since $X$ and $Z$ are independent), but I do not understand how to arrive at this joint density of $X$ and $Y$.

I initially tried to find the pdf of $Y = X^2 + Z$ by computing the cdf $P(X^2 + Z \le a)$ and then differentiating to obtain the pdf. However, this method requires splitting into three cases $(0<a<0.1,\ \ 0.1<a<1, \ \ 1<a<1.1)$ over the rectangle with corners $(0,0), (0, 0.1), (1, 0.1), (1, 0)$, and it leaves me with a pdf in the variable $a$ for each case, which (1) does not match the stated solution, and (2) still leaves me unable to recover the joint distribution from a marginal distribution, since $X$ and $Y$ are dependent. What is the method used here?



The right method is conditioning (on $X$): $$f_{X,Y}(x,y)=f_{Y|X}(y|x)f_X(x).$$ Given $X=x$, $Y=Z+x^2$, so the conditional PDF $f_{Y|X}(\cdot\,|x)$ is the PDF $f_Z(\cdot)$ shifted right by $x^2$, $$f_{Y|X}(y|x)=f_Z(y-x^2),$$ and finally $$f_{X,Y}(x,y)=f_{X}(x)f_Z(y-x^2)=\frac12\cdot 10=5$$ for $-1<x<1$ and $x^2<y<x^2+0.1$, since $f_X \equiv \frac12$ on $(-1,1)$ and $f_Z \equiv 10$ on $(0,0.1)$.
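This constant density is easy to sanity-check numerically. The following Monte Carlo sketch (not part of the answer; the test point $(0.3, 0.14)$ and box half-widths are arbitrary choices, picked so the whole box lies inside the support) estimates the joint density near one point and should come out close to $5$:

```python
import random

random.seed(0)
N = 1_000_000
x0, y0 = 0.3, 0.14    # a point inside the support: x0**2 < y0 < x0**2 + 0.1
dx, dy = 0.05, 0.01   # half-widths of a small box around (x0, y0)

hits = 0
for _ in range(N):
    x = random.uniform(-1, 1)   # X ~ Uniform(-1, 1)
    z = random.uniform(0, 0.1)  # Z ~ Uniform(0, 0.1)
    y = x * x + z               # Y = X^2 + Z
    if abs(x - x0) < dx and abs(y - y0) < dy:
        hits += 1

# Estimated density = P(box) / area(box); should be close to 5
density = hits / N / (2 * dx * 2 * dy)
print(density)
```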

0
On

For your approach, you need to calculate the joint CDF $$P(X^2+Z < y,\ X<x)$$ instead of just $P(X^2+Z < y)$, and then take derivatives with respect to $y$ and $x$ to obtain $$f_{X,Y}(x,y) = \frac{\partial^2}{\partial x\,\partial y} P(X^2+Z < y,\ X<x)\,,$$ the answer you want.
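Carrying this out explicitly (a sketch of the computation, not part of the original answer): conditioning on $X=t$ and using independence of $X$ and $Z$, $$P(X<x,\ X^2+Z<y) = \int_{-1}^{x} P\!\left(Z < y - t^2\right) f_X(t)\,\mathrm{d}t = \frac12\int_{-1}^{x} F_Z\!\left(y - t^2\right)\mathrm{d}t\,.$$ Differentiating first in $x$ and then in $y$ gives $$\frac{\partial^2}{\partial x\,\partial y}\,P(X<x,\ X^2+Z<y) = \frac{\partial}{\partial y}\left[\frac12 F_Z(y-x^2)\right] = \frac12\, f_Z(y-x^2) = \frac12\cdot 10 = 5$$ for $-1<x<1$ and $x^2<y<x^2+0.1$, matching the stated solution.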