Why is the Jacobian of this random vector transformation the determinant of the fixed matrix?

Here's a link: https://www.probabilitycourse.com/chapter6/6_1_5_random_vectors.php. My question concerns Example 6.15.

Here is the stated problem:

Let $\mathbf{X}$ be an $n$-dimensional random vector, let $A$ be a fixed (non-random) invertible $n \times n$ matrix, and let $\mathbf{b}$ be a fixed $n$-dimensional vector. Define the random vector $\mathbf{Y}$ as $\mathbf{Y} = A \mathbf{X} + \mathbf{b}$. Find the pdf of $\mathbf{Y}$ in terms of the pdf of $\mathbf{X}$.

In his answer, he states that the Jacobian of the inverse transformation $\mathbf{X} = H(\mathbf{Y}) = A^{-1}(\mathbf{Y} - \mathbf{b})$ is simply $\det(A^{-1})$. Why?

Considering a simpler case may be helpful. Assume a scalar random variable $x$ and take $b=0$. We then have

$$ y=Ax $$

where $A$ is also just a scalar. The inverse transformation is then $x=\frac{1}{A}y = A^{-1}y$.

In this case, the Jacobian of the inverse transformation is $\frac{\partial x}{\partial y} = A^{-1}$, which for a $1 \times 1$ "matrix" is also equal to its determinant.
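The scalar change-of-variables identity $f_Y(y) = f_X(y/A)\,\lvert A^{-1}\rvert$ can be checked numerically. Below is a minimal sketch using only the Python standard library; the choices $A = 2$, a standard normal $X$, and the evaluation point $y = 1.3$ are illustrative assumptions, not part of the original problem.

```python
import math

def std_normal_pdf(x):
    """pdf of a standard normal random variable X."""
    return math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)

A = 2.0
y = 1.3

# Change of variables: f_Y(y) = f_X(y / A) * |dx/dy| = f_X(y / A) * |1/A|
f_y_change_of_vars = std_normal_pdf(y / A) * abs(1 / A)

# Direct route: Y = A * X with X ~ N(0, 1) means Y ~ N(0, A^2)
f_y_direct = math.exp(-y**2 / (2 * A**2)) / (abs(A) * math.sqrt(2 * math.pi))

print(f_y_change_of_vars, f_y_direct)
```

The two routes agree to floating-point precision, which is exactly the claim that the Jacobian of the inverse map $x = A^{-1}y$ is $A^{-1}$.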

This approach generalizes directly to the vector case: the inverse transformation $\mathbf{X} = H(\mathbf{Y}) = A^{-1}(\mathbf{Y} - \mathbf{b})$ is affine, so its matrix of partial derivatives $\frac{\partial \mathbf{x}}{\partial \mathbf{y}}$ is the constant matrix $A^{-1}$ at every point, and the Jacobian (i.e., its determinant) is therefore $\det(A^{-1}) = 1/\det(A)$.
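The vector case can also be verified numerically. The sketch below (assuming numpy is available; the specific $2 \times 2$ matrix $A$, vector $\mathbf{b}$, evaluation point $\mathbf{y}$, and the choice of a standard normal $\mathbf{X}$ are all illustrative assumptions) compares $f_X(A^{-1}(\mathbf{y}-\mathbf{b}))\,\lvert\det A^{-1}\rvert$ against the known density of $\mathbf{Y} \sim N(\mathbf{b},\, A A^T)$.

```python
import math
import numpy as np

def std_mvn_pdf(x):
    """pdf of an n-dimensional standard normal random vector."""
    n = len(x)
    return math.exp(-(x @ x) / 2) / (2 * math.pi) ** (n / 2)

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # invertible, det(A) = 6
b = np.array([1.0, -1.0])
y = np.array([0.5, 2.0])

# Change of variables: f_Y(y) = f_X(A^{-1}(y - b)) * |det(A^{-1})|
A_inv = np.linalg.inv(A)
x = A_inv @ (y - b)
f_y = std_mvn_pdf(x) * abs(np.linalg.det(A_inv))

# Direct route: Y = A X + b with X ~ N(0, I) means Y ~ N(b, A A^T)
Sigma = A @ A.T
diff = y - b
f_y_direct = math.exp(-(diff @ np.linalg.inv(Sigma) @ diff) / 2) / (
    2 * math.pi * math.sqrt(np.linalg.det(Sigma)))

print(f_y, f_y_direct)
```

The two values agree because the Jacobian determinant of the affine inverse map is the constant $\det(A^{-1})$, independent of $\mathbf{y}$.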

I hope this helps.