This question concerns an elementary problem in probability theory. I have two random variables
$$ \begin{align} X_{1}&=\sqrt{-2\ln{(U_{1})}}\cos{(2\pi U_{2})}\\ X_{2}&=\sqrt{-2\ln{(U_{1})}}\sin{(2\pi U_{2})} \end{align} $$
where $U_{1}$ and $U_{2}$ are independent random variables, each uniformly distributed on the unit interval. I would like to show that $X_{1}$ and $X_{2}$ are independent and $N(0,1)$-distributed. My attempt so far: the inverse of the transformation is given by
$$ \begin{align} X_{1}^{2}+X_{2}^{2}&=-2\ln{(U_{1})}[\cos^{2}{(2\pi U_{2})}+\sin^{2}{(2\pi U_{2})}]=-2\ln{(U_{1})}\implies U_{1}=\exp{\left(-\frac{X_{1}^{2}+X_{2}^{2}}{2}\right)},\\ \frac{X_{2}}{X_{1}}&=\frac{\sqrt{-2\ln{(U_{1})}}\sin{(2\pi U_{2})}}{\sqrt{-2\ln{(U_{1})}}\cos{(2\pi U_{2})}}=\tan{(2\pi U_{2})} \implies U_{2}=\frac{\arctan{(X_{2}/X_{1})}}{2\pi}. \end{align} $$
(Strictly speaking, $\arctan$ inverts $\tan$ only on its principal branch, so this formula recovers $U_{2}$ only there; the same computation goes through piecewise on the remaining branches.)
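As a quick numerical sanity check of this inversion, here is a sketch using only Python's standard library; the test point $(u_1, u_2) = (0.3, 0.1)$ is an arbitrary choice, picked so that $2\pi u_2$ lies inside the principal branch of $\arctan$:

```python
import math

# Forward transform from a sample point (u1, u2) in (0,1)^2.
u1, u2 = 0.3, 0.1  # u2 < 1/4 keeps 2*pi*u2 inside arctan's principal branch
r = math.sqrt(-2 * math.log(u1))
x1 = r * math.cos(2 * math.pi * u2)
x2 = r * math.sin(2 * math.pi * u2)

# Inverse transform as derived above.
u1_rec = math.exp(-(x1**2 + x2**2) / 2)
u2_rec = math.atan(x2 / x1) / (2 * math.pi)

print(u1_rec, u2_rec)  # should recover (0.3, 0.1) up to rounding error
```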
Now, in order to obtain the joint distribution of $X_{1}$ and $X_{2}$ I need to use the transformation theorem, hence I need the Jacobian,
$$ \mathbf{J}(x_{1},x_{2}) = \begin{pmatrix}\frac{\partial}{\partial x_{1}}h_{1}(x_{1},x_{2})&\frac{\partial}{\partial x_{2}}h_{1}(x_{1},x_{2})\\\frac{\partial}{\partial x_{1}}h_{2}(x_{1},x_{2})&\frac{\partial}{\partial x_{2}}h_{2}(x_{1},x_{2})\end{pmatrix} $$
where $h_{1}(x_{1},x_{2})=\exp{\left(-\frac{x_{1}^{2}+x_{2}^{2}}{2}\right)}$ and $h_{2}(x_{1},x_{2})=\frac{\arctan{(x_{2}/x_{1})}}{2\pi}$. I obtain the partial derivatives
$$ \begin{align} \frac{\partial}{\partial x_{1}}h_{1}(x_{1},x_{2})&=-x_{1}\exp{\left(-\frac{x_{1}^{2}+x_{2}^{2}}{2}\right)},& \frac{\partial}{\partial x_{2}}h_{1}(x_{1},x_{2})&=-x_{2}\exp{\left(-\frac{x_{1}^{2}+x_{2}^{2}}{2}\right)},\\ \frac{\partial}{\partial x_{1}}h_{2}(x_{1},x_{2})&=-\frac{1}{2\pi}\frac{1}{1+\left(\frac{x_{2}}{x_{1}}\right)^{2}}\frac{x_{2}}{x_{1}^{2}},& \frac{\partial}{\partial x_{2}}h_{2}(x_{1},x_{2})&=\frac{1}{2\pi}\frac{1}{1+\left(\frac{x_{2}}{x_{1}}\right)^{2}}\frac{1}{x_{1}}. \end{align} $$
The determinant of $\mathbf{J}$ thus becomes
$$ \begin{align} \det{(\mathbf{J}(x_{1},x_{2}))} &= -\frac{1}{2\pi}\frac{1}{1+\left(\frac{x_{2}}{x_{1}}\right)^{2}}\exp{\left(-\frac{x_{1}^{2}+x_{2}^{2}}{2}\right)} - \frac{1}{2\pi}\frac{1}{1+\left(\frac{x_{2}}{x_{1}}\right)^{2}}\left(\frac{x_{2}}{x_{1}}\right)^{2}\exp{\left(-\frac{x_{1}^{2}+x_{2}^{2}}{2}\right)}\\ &=-\frac{1}{2\pi}\exp{\left(-\frac{x_{1}^{2}+x_{2}^{2}}{2}\right)}. \end{align} $$
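This determinant can also be sanity-checked numerically. The sketch below (Python standard library only) approximates the four partial derivatives of $h_1$ and $h_2$ by central differences at an arbitrary test point and compares the resulting determinant with the closed form $-\frac{1}{2\pi}e^{-(x_1^2+x_2^2)/2}$:

```python
import math

def h1(x1, x2):
    return math.exp(-(x1**2 + x2**2) / 2)

def h2(x1, x2):
    return math.atan(x2 / x1) / (2 * math.pi)

def jacobian_det(x1, x2, h=1e-5):
    # Central-difference approximations of the four partial derivatives.
    d1_dx1 = (h1(x1 + h, x2) - h1(x1 - h, x2)) / (2 * h)
    d1_dx2 = (h1(x1, x2 + h) - h1(x1, x2 - h)) / (2 * h)
    d2_dx1 = (h2(x1 + h, x2) - h2(x1 - h, x2)) / (2 * h)
    d2_dx2 = (h2(x1, x2 + h) - h2(x1, x2 - h)) / (2 * h)
    return d1_dx1 * d2_dx2 - d1_dx2 * d2_dx1

x1, x2 = 1.2, 0.7  # arbitrary test point with x1 != 0
numeric = jacobian_det(x1, x2)
closed_form = -math.exp(-(x1**2 + x2**2) / 2) / (2 * math.pi)
print(numeric, closed_form)  # the two values should agree closely
```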
The joint density of $X_{1}$ and $X_{2}$ is
$$ \begin{align} f_{X_{1},X_{2}}(x_{1},x_{2}) &= f_{U_{1},U_{2}}(h_{1},h_{2})\cdot\vert\det{(\mathbf{J})}\vert=\{\text{Independence}\}=1^{2}\cdot\frac{1}{2\pi}\exp{\left(-\frac{x_{1}^{2}+x_{2}^{2}}{2}\right)}\\ &= \frac{1}{2\pi}\exp{\left(-\frac{x_{1}^{2}+x_{2}^{2}}{2}\right)}=\frac{1}{1\cdot\sqrt{2\pi}}\text{e}^{{-\frac{1}{2}\left(\frac{x_{1}-0}{1}\right)^{2}}}\cdot\frac{1}{1\cdot\sqrt{2\pi}}\text{e}^{{-\frac{1}{2}\left(\frac{x_{2}-0}{1}\right)^{2}}}. \end{align} $$
Since the joint density factorizes into a product of two univariate standard normal densities, $X_{1}$ and $X_{2}$ are independent and each is $N(0,1)$-distributed.
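To double-check the conclusion empirically, here is a small Monte Carlo sketch (standard library only; the sample size and tolerances are arbitrary choices). It generates pairs via the transformation above and verifies that each coordinate has mean $\approx 0$, variance $\approx 1$, and that the two are uncorrelated; note that zero correlation is only a necessary condition for independence, not a proof of it:

```python
import math
import random

random.seed(0)
n = 200_000
xs1, xs2 = [], []
for _ in range(n):
    u1 = 1.0 - random.random()  # in (0, 1], avoids log(0)
    u2 = random.random()
    r = math.sqrt(-2 * math.log(u1))
    xs1.append(r * math.cos(2 * math.pi * u2))
    xs2.append(r * math.sin(2 * math.pi * u2))

mean1 = sum(xs1) / n
mean2 = sum(xs2) / n
var1 = sum((x - mean1) ** 2 for x in xs1) / n
var2 = sum((x - mean2) ** 2 for x in xs2) / n
cov = sum((a - mean1) * (b - mean2) for a, b in zip(xs1, xs2)) / n
print(mean1, mean2, var1, var2, cov)  # expect roughly 0, 0, 1, 1, 0
```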
Note: I believe I happened to solve the problem I was stuck on while writing it all out. If anyone sees anything wrong with this solution, please let me know.