Suppose I have two random variables $X_1$ and $X_2$ with a joint exponential pdf, say $$ f(x_1,x_2)=abK\cdot \exp\left\{-(ax_1+bx_2+cx_1 x_2 )\right\}, $$ for $x_1>0$ and $x_2>0$, and for some $a,b,c>0$, and $K$ is a normalizing constant.
I want to construct a random vector $Y=(Y_1,Y_2)^T$ by some transformation $$ Y=AX,\text{ where }X=(X_1,X_2)^T, $$ such that $Y_1$ and $Y_2$ are independent (but not necessarily exponential).
What should be the components of (the square matrix) $A$ here, if this is possible?
Can anyone also recommend an article or a book that might have discussed my problem or anything similar to this?
You don't even need two variables.
First, transform $X_1$ to a uniform distribution on $[0, 1]$ via some function $f$ (e.g. its CDF, by the probability integral transform). Now let $\gamma: [0, 1] \to [0, 1]^2$ be the Hilbert version of the Peano curve, and let $\gamma_1$ and $\gamma_2$ be its $x$ and $y$ coordinates. Then $Y_1 = \gamma_1(f(X_1))$ and $Y_2 = \gamma_2(f(X_1))$ are independent (see "Stochastic Independence and Space-Filling Curves" by John A. R. Holbrook for a proof; the main idea is that, from the $k$-th iteration onward, the probability of landing in any dyadic square with side $2^{-k}$ and corner at $(n \cdot 2^{-k}, m \cdot 2^{-k})$ is $4^{-k}$).
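A quick numerical sketch of this construction, under illustrative assumptions of my own: I take $X_1 \sim \mathrm{Exp}(a)$ so that $f(x) = 1 - e^{-ax}$ is its CDF, approximate $\gamma$ by a finite-order Hilbert curve (the standard iterative index-to-cell algorithm; `order=10` is an arbitrary truncation depth), and then check empirically that $Y_1$ and $Y_2$ look independent on a dyadic square:

```python
import math
import random

def d2xy(order, d):
    """Map a Hilbert-curve index d in [0, 4**order) to integer cell
    coordinates (x, y) on a 2**order x 2**order grid
    (standard iterative Hilbert-index algorithm)."""
    x = y = 0
    t = d
    s = 1
    side = 1 << order
    while s < side:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:          # rotate/reflect the sub-quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def gamma(t, order=10):
    """Finite-order approximation of the Hilbert curve gamma: [0,1] -> [0,1]^2."""
    cells = 4 ** order
    d = min(int(t * cells), cells - 1)
    x, y = d2xy(order, d)
    side = 1 << order
    return x / side, y / side

# X1 ~ Exp(a), so U = f(X1) = 1 - exp(-a*X1) is Uniform[0,1].
random.seed(0)
a = 2.0
n = 100_000
ys = []
for _ in range(n):
    x1 = random.expovariate(a)
    u = 1.0 - math.exp(-a * x1)   # probability integral transform
    ys.append(gamma(u))           # (Y1, Y2) = (gamma_1(U), gamma_2(U))

# Independence check on one dyadic square:
# P(Y1 < 1/2, Y2 < 1/2) should be close to P(Y1 < 1/2) * P(Y2 < 1/2) = 1/4.
p1  = sum(y1 < 0.5 for y1, _ in ys) / n
p2  = sum(y2 < 0.5 for _, y2 in ys) / n
p12 = sum(y1 < 0.5 and y2 < 0.5 for y1, y2 in ys) / n
print(p1, p2, p12)
```

This only verifies independence at the resolution of the dyadic squares up to the chosen order, which matches the proof idea quoted above: each square of side $2^{-k}$ receives probability exactly $4^{-k}$.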