Suppose that $v\sim N(0,\sigma^2 I_n)$ and, with $\|\cdot\|$ denoting the Euclidean norm, define $$ u=v/\|v\|\quad\text{and}\quad w=\|v\|. $$ I've been told that $u$ and $w$ are independent, and I see an argument here on MO, but I can't really follow it because I don't see the justification for the step that states the joint density of $u$ and $w$ (in the link, $f_{u,y}(u,y)$). Can someone please explain that step or, better, provide a more elementary proof of the independence of $u$ and $w$?
Argument for the $n=1$ case. Here $w=|v|$ and $u=v/|v|=\operatorname{sign}(v)$, which is $\pm 1$ almost surely. By the symmetry of the normal density about $0$, we have, for $w_0\geq 0$, \begin{align*} \Pr[(w\leq w_0) \cap (u=+1)]&=\Pr[0<v\leq w_0]=\tfrac12\Pr[|v|\leq w_0]=\Pr[w\leq w_0]\underbrace{\Pr[u=+1]}_{1/2},\\ \Pr[(w\leq w_0) \cap (u=-1)]&=\Pr[-w_0\leq v< 0]=\tfrac12\Pr[|v|\leq w_0]=\Pr[w\leq w_0]\underbrace{\Pr[u=-1]}_{1/2}, \end{align*} from which independence follows.
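A quick Monte Carlo sanity check of this computation, as a sketch in Python/NumPy (the values of $\sigma$ and $w_0$ are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
v = rng.normal(0.0, sigma, size=1_000_000)  # v ~ N(0, sigma^2), n = 1

u = np.sign(v)  # direction: +1 or -1 (v = 0 has probability zero)
w = np.abs(v)   # magnitude

# Independence predicts P(w <= w0 | u = +1) = P(w <= w0).
w0 = 1.5
print(np.mean(w[u > 0] <= w0))  # conditional relative frequency
print(np.mean(w <= w0))         # unconditional relative frequency
```

Both printed frequencies should agree up to sampling error.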
The multivariate standard normal density has the form $f(x)\propto\exp(-\|x\|^2/2)$, where $x$ is the vector (for $N(0,\sigma^2 I_n)$ the exponent is $-\|x\|^2/(2\sigma^2)$; the argument is unchanged). Observe that $f(x)$ doesn't depend on the direction $\frac{x}{\|x\|}$ but only on $\|x\|$; hence, conditioned on any sphere $\|x\|=a$, $f(x)$ is constant, which means all directions are equally likely (i.e., the conditional distribution of the direction is uniform on the sphere).
Now, the conditional distribution of the direction is the same uniform distribution regardless of the magnitude, i.e. \begin{equation} P\left(\frac{x}{\|x\|}\,\middle|\,\|x\|=a\right)=\text{uniform on the unit sphere, for every }a>0. \end{equation} Since all of these conditional distributions coincide, the unconditional distribution $P(\frac{x}{\|x\|})$ is that same uniform distribution, hence \begin{equation} P\left(\frac{x}{\|x\|}\,\middle|\,\|x\|\right)=P\left(\frac{x}{\|x\|}\right). \end{equation} But this is the very definition of independence, as it implies (by the product rule) \begin{equation} P\left(\frac{x}{\|x\|},\|x\|\right)=P\left(\frac{x}{\|x\|}\,\middle|\,\|x\|\right)P(\|x\|)=P\left(\frac{x}{\|x\|}\right)P(\|x\|). \end{equation}
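For completeness, the joint-density step from the linked argument can be written out as a change to polar coordinates: with $x=wu$, $w=\|x\|>0$, $u\in S^{n-1}$, the volume element is $dx=w^{n-1}\,dw\,dS(u)$, where $dS$ is the surface measure on the unit sphere and $c_n$ its total area. The joint law then factors: \begin{equation} f(x)\,dx=\frac{e^{-w^2/(2\sigma^2)}}{(2\pi\sigma^2)^{n/2}}\,w^{n-1}\,dw\,dS(u) =\underbrace{\frac{c_n\,w^{n-1}}{(2\pi\sigma^2)^{n/2}}\,e^{-w^2/(2\sigma^2)}\,dw}_{\text{marginal density of }w} \cdot\underbrace{\frac{dS(u)}{c_n}}_{\text{uniform measure on }S^{n-1}}. \end{equation} A product of a density in $w$ alone and the uniform measure in $u$ alone is exactly independence.

And a Monte Carlo check for $n>1$, again only a sketch in Python/NumPy ($n$, $\sigma$, and the statistics compared are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, N, sigma = 3, 1_000_000, 0.7
v = rng.normal(0.0, sigma, size=(N, n))  # rows ~ N(0, sigma^2 I_n)

w = np.linalg.norm(v, axis=1)  # magnitudes ||v||
u = v / w[:, None]             # directions v/||v|| on the unit sphere

# Independence predicts the direction is distributed identically on
# {w small} and {w large}; compare two direction statistics across halves.
small = w <= np.median(w)
print(u[small, 0].mean(), u[~small, 0].mean())              # both ~ 0
print((u[small, 0] > 0).mean(), (u[~small, 0] > 0).mean())  # both ~ 1/2
```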