Let $M$ and $N$ be independent random variables, each distributed Uniform$[0, 1]$.
Let $(M_n)_{n\geq 1}$ and $(N_n)_{n\geq 1}$ be two independent sequences of i.i.d. random variables, each uniformly distributed over $[-1, 1]$. Let $Z = \inf \{ n \geq 1 : 0 < M_n^2 + N_n^2 < 1 \}$ and
$X = M_Z \sqrt{\frac{-2 \log (M_Z^2 + N_Z^2)}{M_Z^2 + N_Z^2}}~~$ and $Y = N_Z \sqrt{\frac{-2 \log (M_Z^2 + N_Z^2)}{M_Z^2 + N_Z^2}}$
What is the distribution of $Z$? Show that $X$ and $Y$ are independent random variables, each distributed $N(0,1)$.
Addendum: a previous part of this exercise asks to show that $X$ and $Y$ are independent random variables distributed $N(0, 1)$, given that
$ X = \sqrt{-2 \log(M)} \cos(2 \pi N)$ and $Y = \sqrt{-2 \log(M)} \sin(2 \pi N)$
I have already proved this part using the change-of-variables formula; independence then follows because the joint density of $X, Y$ factors into the product of the marginal densities of $X$ and $Y$.
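This earlier part is also easy to sanity-check numerically. Below is a minimal simulation sketch (pure Python; the helper name `box_muller` is mine, not from the exercise): with $M, N$ uniform on $[0,1]$, the empirical mean and variance of $X$ should be close to $0$ and $1$, and $X, Y$ should be uncorrelated.

```python
import math
import random

random.seed(0)

def box_muller():
    """One (X, Y) pair via X = sqrt(-2 log M) cos(2 pi N), Y = sqrt(-2 log M) sin(2 pi N)."""
    m, n = random.random(), random.random()
    while m == 0.0:  # P(M = 0) is zero in theory; guard the log for floating point
        m = random.random()
    r = math.sqrt(-2.0 * math.log(m))
    return r * math.cos(2.0 * math.pi * n), r * math.sin(2.0 * math.pi * n)

pairs = [box_muller() for _ in range(100_000)]
xs = [p[0] for p in pairs]
ys = [p[1] for p in pairs]

mean_x = sum(xs) / len(xs)                              # should be near 0
var_x = sum(x * x for x in xs) / len(xs)                # should be near 1
cov_xy = sum(x * y for x, y in zip(xs, ys)) / len(xs)   # near 0 if uncorrelated
print(mean_x, var_x, cov_xy)
```

This only checks the first two moments and the correlation, of course; it is a plausibility check, not a proof.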
First, the distribution of $Z$: each pair $(M_n, N_n)$ is uniform on the square $[-1,1]^2$, so it lands in the open unit disk (minus the origin, a null set) with probability $\pi/4$, independently across $n$. Hence $Z$ is geometric with parameter $\pi/4$, i.e. $P(Z = k) = (1 - \pi/4)^{k-1}\,\frac{\pi}{4}$ for $k \geq 1$.

Next, conditionally on $Z$, the pair $(M_Z, N_Z)$ is uniformly distributed on the unit disk; writing $M, N$ for $M_Z, N_Z$, we have $f_{M,N}(m,n) = \pi^{-1}$ inside the unit disk and zero outside. The new variables $X, Y$ are expressed in terms of $M, N.$ Recall the change-of-variables theorem, which states that if we have a bijective function $g : (x,y) \ \mapsto \ (m, n)$ with continuous partial derivatives, then the joint density of $X, Y$ is given by
$$f_{X,Y}(x,y) = f_{M,N}( g_1(x,y), g_2(x,y) ) \cdot |J_g|$$
where $J_g$ is the determinant of the Jacobian matrix of $g,$ and $|J_g|$ is the absolute value of that.
For this specific question, rearranging the defining equations for $X,Y,$ gives $$g(x,y) = h(x,y)\begin{pmatrix}x\\y\end{pmatrix}, \ \ \text{ where }\ h(x,y) = \frac{\exp \left(-\frac{x^2+y^2}{4}\right)}{\sqrt{x^2+y^2}}$$
One could start routinely calculating the partial derivatives and simplifying the resulting terms to compute $J_g,$ but the computation is considerably easier if we continue with the special form that $g$ has. \begin{align} J_g &= \frac{ \partial g_1}{\partial x} \frac{ \partial g_2}{\partial y} - \frac{\partial g_1}{\partial y} \frac{\partial g_2}{\partial x}\\ &=\left(h + x \frac{\partial h}{\partial x}\right)\left(h + y \frac{\partial h}{\partial y}\right) -\left(x \frac{\partial h}{\partial y}\right)\left(y \frac{\partial h}{\partial x}\right)\\ &=h^2 + \left( x \frac{\partial h}{\partial x} + y \frac{\partial h}{\partial y}\right)h\\ &= h^2 \left( 1 + \bigg\langle \nabla \log h \ , \begin{pmatrix}x\\y\end{pmatrix} \bigg\rangle \right) \end{align}
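Before finishing the calculation by hand, one can double-check the Jacobian determinant symbolically. A short sketch using `sympy` (not part of the proof; restricting to an open quadrant with positive symbols suffices for the identity check):

```python
import sympy as sp

# Work on an open quadrant: positive symbols let sqrt/exp simplify cleanly.
x, y = sp.symbols("x y", positive=True)
s = x**2 + y**2
h = sp.exp(-s / 4) / sp.sqrt(s)
g = sp.Matrix([h * x, h * y])            # (m, n) = g(x, y)

# Jacobian determinant of g, compared against the closed form -exp(-s/2)/2.
J = g.jacobian(sp.Matrix([x, y])).det()
J_expected = -sp.exp(-s / 2) / 2
assert sp.simplify(J - J_expected) == 0
print(sp.simplify(J))
```

This confirms the value that the analytic computation below arrives at.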
We have $\displaystyle\nabla \log h(x,y) = \left( \frac{-1}{x^2+y^2} - \frac{1}{2} \right) \begin{pmatrix}x\\y\end{pmatrix},$ so $\displaystyle 1 + \bigg\langle \nabla \log h \ , \begin{pmatrix}x\\y\end{pmatrix} \bigg\rangle = - \frac{x^2+y^2}{2}.$
Thus, $$J_g = - \frac{1}{2} \exp \left( - \frac{x^2+y^2}{2}\right)$$
and $$f_{X,Y}(x,y) = \frac{1}{2\pi} \exp \left( - \frac{x^2+y^2}{2}\right)$$
Therefore, the joint distribution of $X$ and $Y$ is that of the standard bivariate normal distribution, which implies that $X$ and $Y$ are independent identically distributed standard normal variables.
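Finally, the whole construction can be checked by simulation. Since each attempt lands in the unit disk with probability $\pi/4$, the sample mean of $Z$ should be near $4/\pi \approx 1.273$, and the sample moments of $X, Y$ should match independent $N(0,1)$ variables. A minimal sketch (pure Python; the helper name `polar_sample` is mine):

```python
import math
import random

random.seed(0)

def polar_sample():
    """One draw of (Z, X, Y) following the construction in the exercise."""
    z = 0
    while True:
        z += 1
        m = random.uniform(-1.0, 1.0)
        n = random.uniform(-1.0, 1.0)
        s = m * m + n * n
        if 0.0 < s < 1.0:  # accept: (m, n) is uniform on the punctured unit disk
            factor = math.sqrt(-2.0 * math.log(s) / s)
            return z, m * factor, n * factor

draws = [polar_sample() for _ in range(100_000)]
zs, xs, ys = zip(*draws)

mean_z = sum(zs) / len(zs)                              # near 4/pi, the mean of Geometric(pi/4)
mean_x = sum(xs) / len(xs)                              # near 0
var_x = sum(x * x for x in xs) / len(xs)                # near 1
cov_xy = sum(x * y for x, y in zip(xs, ys)) / len(xs)   # near 0
print(mean_z, mean_x, var_x, cov_xy)
```

Again, this checks only low-order moments; the actual argument is the density computation above.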