Are these random variables independent, and can I use the transformation formula to transform their distribution this way?


Morning Stack Exchange,

I am working through a problem that asks for a change of variables for a particular distribution. I believe I have a reasonable approach, but I have a couple of questions that could use clearing up.

Let $X_1, X_2, X_3$ have joint distribution $f(x_1, x_2, x_3) = 6$ with support $0 < x_1 < x_2 < x_3 < 1$ and $0$ elsewhere.

My first question is:

Is it not true that the $X_i$ are dependent? There appears to be a clear dependence built into the support of the joint PDF, but a friend in the class says the $X_i$ are independent and that we can start solving the problem from that point. I do not see how the $X_i$ could be independent; if someone could show me otherwise or confirm my thinking, I would appreciate it.

The other half of my question is whether my method for computing the transformation works. We are asked to make the change of variables $Y_1 = \frac{X_1}{X_2}$, $Y_2 = \frac{X_2}{X_3}$, $Y_3 = X_3$. My original plan was to compute the inverses of these transformations and then the determinant $J$ of the Jacobian matrix, but I hesitate because the transformations do not appear to be one-to-one. Take $Y_1$, for example: $1/2$ is in the image of $Y_1$, and we can produce it with the pair $X_1 = 1/8$, $X_2 = 1/4$ or with $X_1 = 1/4$, $X_2 = 1/2$. Both inputs give $Y_1 = 1/2$, so $Y_1$ is not one-to-one, and it seems the Jacobian method cannot be used.

In summary: can I use the Jacobian transformation formula here, or should I fall back on the CDF approach because the transformations do not appear to be one-to-one? And are the original $X_i$ independent?

Best Answer

Your reasoning is correct. As you noted, the support is

$$ \bbox[5px,border:2px solid black] { 0<X_1<X_2<X_3<1 \qquad } $$

so the random variables cannot be independent: the support is not a Cartesian product of intervals, which independence would require.

Just for the sake of completeness, here are the marginal densities:

$$f_{X_1}(x_1)=3(1-x_1)^2\mathbb{1}_{(0;1)}(x_1)$$

$$f_{X_2}(x_2)=6x_2(1-x_2)\mathbb{1}_{(0;1)}(x_2)$$

$$f_{X_3}(x_3)=3x_3^2\mathbb{1}_{(0;1)}(x_3)$$

$X_1\sim Beta(1;3)$

$X_2 \sim Beta(2;2)$

$X_3 \sim Beta(3;1)$

As you can see, the product of the three marginal densities is not $6$, which confirms the dependence.
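If you want to sanity-check the dependence numerically, note that the joint density $6$ on $0<x_1<x_2<x_3<1$ is exactly the density of the order statistics of three i.i.d. $U(0,1)$ draws, so sorting three uniforms samples from it exactly. A minimal Monte Carlo sketch (the sample size and seed are arbitrary choices):

```python
import random

# The joint pdf 6 on 0 < x1 < x2 < x3 < 1 is the pdf of the order
# statistics of three i.i.d. Uniform(0, 1) draws, so sorting three
# uniforms gives an exact sample from f(x1, x2, x3).
random.seed(0)
N = 200_000
samples = [sorted(random.random() for _ in range(3)) for _ in range(N)]

x1 = [s[0] for s in samples]
x3 = [s[2] for s in samples]

m1 = sum(x1) / N           # should be near E[X1] = 1/4 (Beta(1,3))
m3 = sum(x3) / N           # should be near E[X3] = 3/4 (Beta(3,1))
cov = sum(a * b for a, b in zip(x1, x3)) / N - m1 * m3

print(f"E[X1] ~ {m1:.3f}, E[X3] ~ {m3:.3f}, Cov(X1, X3) ~ {cov:.4f}")
# A clearly nonzero covariance rules out independence
# (the exact value is 1/80 = 0.0125 for these uniform order statistics).
```

A nonzero covariance is only a one-way check, of course, but here it already settles the question.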


Using the given transformation you get

$$ \begin{cases} y_1=\frac{x_1}{x_2} \\ y_2=\frac{x_2}{x_3} \\ y_3=x_3 \end{cases}$$

Inverting, this becomes

$$ \begin{cases} x_1=y_1\cdot y_2\cdot y_3 \\ x_2= y_2\cdot y_3\\ x_3=y_3 \end{cases}$$

As usual, calculate the Jacobian, obtaining $|J|=y_2\cdot y_3^2$. Note that although each $Y_i$ taken alone is not one-to-one, the joint map $(X_1,X_2,X_3)\mapsto(Y_1,Y_2,Y_3)$ is, which is all the Jacobian method requires.
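The determinant can be checked numerically with a central-difference Jacobian of the inverse map; this is a quick sketch at one arbitrary interior point, not a proof:

```python
# Numerically check |J| = y2 * y3^2 for the inverse map
# x1 = y1*y2*y3, x2 = y2*y3, x3 = y3.

def inverse_map(y1, y2, y3):
    return (y1 * y2 * y3, y2 * y3, y3)

def jacobian_det(y, h=1e-6):
    # Central-difference columns d x_i / d y_k of the Jacobian matrix.
    cols = []
    for k in range(3):
        yp = list(y); yp[k] += h
        ym = list(y); ym[k] -= h
        fp, fm = inverse_map(*yp), inverse_map(*ym)
        cols.append([(fp[i] - fm[i]) / (2 * h) for i in range(3)])
    # Rebuild J[i][k] = d x_i / d y_k and expand the 3x3 determinant.
    J = [[cols[k][i] for k in range(3)] for i in range(3)]
    return (J[0][0] * (J[1][1] * J[2][2] - J[1][2] * J[2][1])
            - J[0][1] * (J[1][0] * J[2][2] - J[1][2] * J[2][0])
            + J[0][2] * (J[1][0] * J[2][1] - J[1][1] * J[2][0]))

y = (0.3, 0.5, 0.7)
num = jacobian_det(y)
formula = y[1] * y[2] ** 2          # y2 * y3^2 = 0.245
print(num, formula)
```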

After a little reasoning about the supports of the $Y_i$, the joint density becomes (conveniently factorized)

$$ \bbox[5px,border:2px solid black] { f_{Y_1Y_2Y_3}(y_1,y_2,y_3)=\mathbb{1}_{(0;1)}(y_1)\cdot 2y_2\mathbb{1}_{(0;1)}(y_2)\cdot 3y_3^2\mathbb{1}_{(0;1)}(y_3) \qquad } $$

Which shows that $Y_1,Y_2,Y_3$ are independent with the following marginal laws:

$Y_1\sim U(0;1)$ (it can also be viewed as a $Beta(1;1)$)

$Y_2\sim Beta(2;1)$

$Y_3\sim Beta(3;1)$
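These marginals (and the independence) can also be checked by simulation: sample $(X_1,X_2,X_3)$ as sorted uniforms, apply the transformation, and compare sample means with the Beta means $E[Y_1]=1/2$, $E[Y_2]=2/3$, $E[Y_3]=3/4$. A sketch, with arbitrary seed and sample size:

```python
import random

# Sample (X1, X2, X3) as sorted uniforms (joint pdf 6 on the ordered
# region), apply the transformation, and compare the Y means with the
# claimed marginals: E[Y1] = 1/2 (Uniform), E[Y2] = 2/3 (Beta(2,1)),
# E[Y3] = 3/4 (Beta(3,1)). Cov(Y1, Y2) should be near zero.
random.seed(1)
N = 200_000
y1s, y2s, y3s = [], [], []
for _ in range(N):
    x1, x2, x3 = sorted(random.random() for _ in range(3))
    y1s.append(x1 / x2)
    y2s.append(x2 / x3)
    y3s.append(x3)

m1, m2, m3 = (sum(v) / N for v in (y1s, y2s, y3s))
cov12 = sum(a * b for a, b in zip(y1s, y2s)) / N - m1 * m2

print(f"E[Y1] ~ {m1:.3f}, E[Y2] ~ {m2:.3f}, E[Y3] ~ {m3:.3f}, "
      f"Cov(Y1, Y2) ~ {cov12:.4f}")
```

A near-zero covariance is consistent with (though weaker than) the full independence shown by the factorized density above.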