Is there a standard method for determining whether functions of multiple random variables are independent? For example, if I have random variables $X$ and $Y$ that are independent exponential RVs with parameters $\lambda$ and $\mu$ respectively, the joint PDF of the two is:
$$f_{X,Y}(x,y) = \lambda\mu\, e^{-\lambda x-\mu y}$$ when $x$ and $y$ are both greater than 0, and the PDF equals zero otherwise.
Now if I have two functions of $X$ and $Y$ that are
$$ A = X+Y $$ $$ B = X-Y $$
How could I go about showing whether or not $A$ and $B$ are independent? I know how to derive the marginal PDFs of $X$ and $Y$ as well as the CDF/PDF of $A$ and $B$ (using the Law of Total Probability and the Substitution Law), but I'm unsure how to determine if $A$ and $B$ are independent.
And beyond the example, is there a general way to determine whether functions of the same random variables are independent, regardless of the distributions of the input variables?
There is a standard change-of-variables technique: invert the transformation to get $X = (A+B)/2$ and $Y = (A-B)/2$, whose Jacobian has absolute determinant $1/2$, from which you should be able to obtain $$ f_{A,B}(a,b) = \frac{\lambda\mu}{2}e^{-\frac{1}{2}((\lambda+\mu)a+(\lambda-\mu)b)}$$ for $a>0,$ $|b|<a.$ Even though the density superficially seems to factor, notice that the bounds depend on one another (i.e. the support is not a rectangle), which means $A$ and $B$ are dependent.
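If you want to sanity-check that density, here is a minimal Monte Carlo sketch (rates $\lambda=1$, $\mu=2$ chosen arbitrarily): it estimates $P(A\le 1,\ B\le 0)$ by simulation and compares it against a numerical integral of $f_{A,B}$ over the same event, respecting the support $|b|<a$.

```python
import random
import math

random.seed(0)
lam, mu = 1.0, 2.0   # arbitrary rate parameters for the check
n = 200_000

# Simulate A = X+Y, B = X-Y from independent exponentials and
# estimate P(A <= 1, B <= 0).
hits = 0
for _ in range(n):
    x = random.expovariate(lam)
    y = random.expovariate(mu)
    a, b = x + y, x - y
    if a <= 1.0 and b <= 0.0:
        hits += 1
empirical = hits / n

# Midpoint-rule integration of the transformed density
# f_{A,B}(a,b) = (lam*mu/2) * exp(-((lam+mu)*a + (lam-mu)*b)/2)
# over {a <= 1, b <= 0}, where the support forces b in (-a, 0].
steps = 400
da = 1.0 / steps
integral = 0.0
for i in range(steps):
    a = (i + 0.5) * da
    db = a / steps
    for j in range(steps):
        b = -a + (j + 0.5) * db
        integral += (lam * mu / 2) \
            * math.exp(-((lam + mu) * a + (lam - mu) * b) / 2) * da * db

print(round(empirical, 3), round(integral, 3))  # both close to ~0.171
```

The two numbers agreeing (up to Monte Carlo noise) is evidence the density and its triangular support were derived correctly.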
In fact, we could have seen this dependence without doing the change of variables at all. Since $|X-Y| < X+Y$ whenever $X,Y>0$, we have $|B|<A,$ so the value of $A$ changes the support of $B$'s conditional distribution.