What goes wrong in the product distribution of two iid uniform random variables.


Let $X$ and $Y$ be two independent uniform $(0,1)$ random variables. Let $U=Y$ and $V=XY$. I would like to find the distribution of $V$.

Then $Y=U$ and $X=\frac{V}{U}$, so the Jacobian is $|J(u,v)|=\frac{1}{u}$. Thus $$f_{U,V}(u,v)=f_{X,Y}(x,y)|J(u,v)|=f_X(x)f_Y(y)|J(u,v)|=1\cdot1\cdot\frac{1}{u}=\frac{1}{u}.$$

Now $U=Y$ so the range of $u$ is $[0,1]$. So we have $$f_V(v)=\int_0^1 \frac{1}{u} du = \log(u)\Big{|}_0^1$$


Now I know the right answer is $f_V(v)=-\log(v)$, but I would like to know what goes wrong in the proof above. Perhaps it is the range of $u$, but since $U=Y$ the range is $[0,1]$.

Thank you very much.

2 Answers

BEST ANSWER

Your mistake is in the argument of $f_X$: it must be evaluated at $x=v/u$, and $f_X(v/u)=0$ whenever $v/u>1$, i.e. whenever $u<v$. $$f_{U,V}(u,v)=f_{X,Y}(x,y)|J(u,v)|=f_{X}(v/u)f_Y(u)\frac1u$$

Hence, for any $v\in (0,1)$ \begin{align}f_V(v)&=\int_{u}f_{U,V}(u,v)du=\int_{0}^1f_{X}(v/u)f_Y(u)\frac1u\,du\\[0.2cm]&=\int_{0}^v 0\cdot 1\cdot\frac1u\,du+\int_{v}^11 \cdot 1\cdot\frac1u\,du=\log{(u)}\Big|_{v}^1=0-\log{(v)}\end{align}
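This closed form is easy to check with a quick Monte Carlo sketch (a minimal Python example; the sample size is arbitrary). Integrating $f_V(v)=-\log(v)$ gives the CDF $F_V(v)=v-v\log(v)$ on $(0,1)$, which can be compared with the empirical CDF of simulated products $XY$:

```python
import math
import random

random.seed(0)
n = 200_000
# V = X * Y with X, Y independent Uniform(0,1)
samples = [random.random() * random.random() for _ in range(n)]

# Integrating f_V(v) = -log(v) gives the CDF F_V(v) = v - v*log(v) on (0,1).
for v in (0.1, 0.3, 0.5, 0.8):
    empirical = sum(s <= v for s in samples) / n
    analytic = v - v * math.log(v)
    print(f"v={v}: empirical CDF {empirical:.3f}, analytic {analytic:.3f}")
```

The empirical and analytic values agree to within Monte Carlo error, which supports the density derived above.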

Second answer:

The joint distribution of X and Y is given by:

$$f_{X,Y}(x, y) = I(0<x<1)I(0<y<1)$$

since:

$$f_{X}(x) = I(0<x<1), f_{Y}(y) = I(0<y<1)$$

where $I(\cdot)$ is the indicator function, assuming value $1$ if the condition within the brackets is met, $0$ otherwise.

From this, we obtain the joint distribution of $U$ and $V$:

$$f_{U,V}(u, v) = f_{X,Y}\left(\frac{v}{u}, u\right)\mid J \mid = I\left(0<\frac{v}{u}<1\right)I(0<u<1)\, \frac{1}{u} = I(0<v<u)I(0<u<1)\,\frac{1}{u}.$$

The support in $u$, therefore, is $(v, 1)$ (since $v>0$ and $v<u<1$). Integrating the joint density over this range, you obtain $$f_{V}(v) = \int_v^1 \frac{1}{u}\,du = -\log(v)\,I(0<v<1),$$ which is indeed nonnegative on $(0,1)$ and $0$ outside that range.
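As a sanity check on the restricted support, one can approximate $\int_v^1 \frac{1}{u}\,du$ numerically and compare it with $-\log(v)$ (a sketch using a midpoint rule; the step count is an arbitrary choice):

```python
import math

def f_V(v, steps=100_000):
    # Midpoint-rule approximation of the integral of 1/u over (v, 1),
    # the support forced by the indicators I(0<v<u) I(0<u<1).
    h = (1 - v) / steps
    return h * sum(1 / (v + (k + 0.5) * h) for k in range(steps))

for v in (0.1, 0.5, 0.9):
    print(f"v={v}: integral {f_V(v):.6f}, -log(v) = {-math.log(v):.6f}")
```

The numerical integral matches $-\log(v)$ to many decimal places, confirming that restricting $u$ to $(v,1)$ is exactly what repairs the divergent integral in the question.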