I noticed that this problem has already been posted here twice, but one post was marked as a duplicate and the other was removed, so I hope this isn't breaking any rules.
I wish to solve the following problem:
The random variables $X_1$ and $X_2$ are independent and $\mathrm{N}(0,1)$-distributed. Set $$Y_1 = \frac{X_1^2-X_2^2}{\sqrt{X_1^2+X_2^2}}, \hspace{2mm} Y_2 = \frac{2X_1X_2}{\sqrt{X_1^2+X_2^2}}. $$ Show that $Y_1$ and $Y_2$ are independent, $\mathrm{N}(0,1)$-distributed random variables.
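Before attempting a proof, I convinced myself the claim is plausible with a quick Monte Carlo check (just a sketch; the sample size and tolerances are arbitrary choices of mine):

```python
# Monte Carlo sanity check of the claimed distribution of (Y1, Y2).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)

r = np.sqrt(x1**2 + x2**2)   # r > 0 almost surely, so the division is safe
y1 = (x1**2 - x2**2) / r
y2 = 2 * x1 * x2 / r

# If the claim holds, both samples should have mean ~0 and std ~1,
# and the empirical correlation between them should be ~0.
print(y1.mean(), y1.std(), y2.mean(), y2.std())
print(np.corrcoef(y1, y2)[0, 1])
```

The empirical means, standard deviations, and correlation come out close to $0$, $1$, and $0$ respectively, which is consistent with the statement (though of course zero correlation alone doesn't prove independence).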
My current plan is to use the transformation theorem. Since I have $$f_{X_1,X_2}(x_1,x_2) = \frac{1}{2\pi}\exp\left\{-\frac{1}{2}(x_1^2+x_2^2)\right\},$$ I thought I might be able to find $f_{Y_1,Y_2}(y_1,y_2)$ by computing $$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}(x_1(y_1,y_2),x_2(y_1,y_2))\,|\mathrm{J}|, $$
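To spell out what I mean by $\mathrm{J}$: it is the Jacobian determinant of the inverse map,
$$\mathrm{J} = \det\begin{pmatrix} \dfrac{\partial x_1}{\partial y_1} & \dfrac{\partial x_1}{\partial y_2} \\[2mm] \dfrac{\partial x_2}{\partial y_1} & \dfrac{\partial x_2}{\partial y_2} \end{pmatrix},$$
so computing it seems to require the inverse transformation explicitly (or at least its partial derivatives).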
(where $x_1(y_1,y_2)$ means $x_1$ in terms of $y_1,y_2$).
The issue I have is finding the inverses, i.e. expressing, for example, $x_1$ in terms of $y_1$ and $y_2$; that is my main reason for posting. I'm also not completely sure whether this is the right approach. There is a later chapter in my book on something called quadratic forms, but this problem isn't part of that chapter. Any alternative ideas for approaching this would be much appreciated.
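One thought I had, in case it helps anyone suggest a route (I'm not sure it leads anywhere, and it may not be the book's intended method): in polar coordinates $X_1 = R\cos\Theta$, $X_2 = R\sin\Theta$, the double-angle formulas give
$$Y_1 = \frac{R^2(\cos^2\Theta - \sin^2\Theta)}{R} = R\cos 2\Theta, \qquad Y_2 = \frac{2R^2\cos\Theta\sin\Theta}{R} = R\sin 2\Theta,$$
so $(Y_1, Y_2)$ is just $(X_1, X_2)$ with the radius kept and the angle doubled. Perhaps this makes the change of variables easier to handle than inverting the Cartesian formulas directly.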