Integration in polar coordinates in $\mathbb{R}^d$


It is a standard fact from real analysis that there exists a unique Borel measure $\sigma$ on the unit sphere $\mathbb S^{d-1}=\{x\in\mathbb{R}^d:|x|=1\}$ such that \begin{align}\tag{1}\label{eq:1} \int_{\mathbb{R}^d} g(x) \,dx = \int_0^\infty \int_{\mathbb S^{d-1}} g(ry) \, d\sigma(y) \, r^{d-1} \, dr. \end{align} My question is: how can one use this formula to show that for $f \ge 0$ the following identity holds? \begin{align*} \int_{\mathbb S^{d-1}} f(x) \, d\sigma(x) = \int_{y \in \mathbb{R}^{d-1},\, |y| \le 1} \Big( f\big(y , (1 - |y|^2)^{1/2} \big) + f\big(y , - (1 - |y|^2)^{1/2} \big) \Big) (1 - |y|^2)^{-1/2} \, dy. \end{align*}
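(Not a proof, of course, but the claimed identity can at least be sanity-checked numerically in the case $d=2$, where $\mathbb S^1$ is the unit circle and $d\sigma$ is arclength. The test function below is an arbitrary choice of mine, taken asymmetric in the last coordinate so that the two hemisphere terms differ.)

```python
import numpy as np
from scipy.integrate import quad

# Arbitrary test function on R^2; asymmetric in x2 so that the two
# hemisphere terms f(y, +sqrt(1-y^2)) and f(y, -sqrt(1-y^2)) differ.
def f(x1, x2):
    return np.exp(x2) + x1**2

# Left-hand side: integral over S^1 with arclength measure,
# parametrized by the angle t -> (cos t, sin t).
lhs, _ = quad(lambda t: f(np.cos(t), np.sin(t)), 0.0, 2.0 * np.pi)

# Right-hand side: the two-hemisphere parametrization y -> (y, ±sqrt(1-y^2))
# with the weight (1 - y^2)^(-1/2); the endpoint singularity at y = ±1 is
# integrable and scipy's adaptive quadrature copes with it.
def rhs_integrand(y):
    h = np.sqrt(1.0 - y**2)
    return (f(y, h) + f(y, -h)) / h

rhs, _ = quad(rhs_integrand, -1.0, 1.0)

print(lhs, rhs)  # the two values agree to quadrature accuracy
```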

I tried as follows: given $f:\mathbb S^{d-1}\rightarrow[0,\infty)$, define $F:\mathbb{R}^d\rightarrow[0,\infty)$ by setting $$ F(x)=f(x/|x|)\chi_{\{|x|<1\}}(x). $$ Plugging $g=F$ into \eqref{eq:1}, we have $F(ry)=f(y)$ for $0<r<1$ and $F(ry)=0$ for $r\ge 1$, so the right-hand side of \eqref{eq:1} equals $\int_0^1 r^{d-1}\,dr \int_{\mathbb S^{d-1}} f(y)\,d\sigma(y)=\frac{1}{d}\int_{\mathbb S^{d-1}} f(y) \, d\sigma(y)$. Thus the problem is reduced to showing that $$ d\int_{|x|<1} f(x/|x|) \,dx=\int_{y \in \mathbb{R}^{d-1},\, |y| \le 1} \Big( f\big(y , (1 - |y|^2)^{1/2} \big) + f\big(y , - (1 - |y|^2)^{1/2} \big) \Big) \frac{dy}{(1 - |y|^2)^{1/2}}. $$ I guess that at this point one needs a suitable change of variables, but I couldn't quite get the computation right.
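(Again not the requested derivation, but the reduced identity can be confirmed numerically in the case $d=2$. The left side is evaluated below in ordinary plane polar coordinates, and the substitution $y=\sin t$, which makes the factor $(1-y^2)^{-1/2}$ on the right side cancel, may itself suggest the kind of change of variables to look for. The test function is an arbitrary choice of mine.)

```python
import numpy as np
from scipy.integrate import quad, dblquad

# Arbitrary test function on R^2, asymmetric in x2.
def f(x1, x2):
    return np.exp(x2) + x1**2

# Left side: d * integral of f(x/|x|) over the unit ball, with d = 2.
# The disk integral is computed in plane polar coordinates
# (x = r(cos t, sin t), dx = r dr dt), which sidesteps the 0/0
# evaluation of f(x/|x|) at the origin.
lhs, _ = dblquad(lambda t, r: r * f(np.cos(t), np.sin(t)),
                 0.0, 1.0,          # outer variable: r in (0, 1)
                 0.0, 2.0 * np.pi)  # inner variable: t in (0, 2*pi)
lhs *= 2.0  # the factor d = 2

# Right side after substituting y = sin(t): dy = cos(t) dt and
# (1 - y^2)^(1/2) = cos(t), so the singular weight cancels exactly.
rhs, _ = quad(lambda t: f(np.sin(t), np.cos(t)) + f(np.sin(t), -np.cos(t)),
              -np.pi / 2.0, np.pi / 2.0)

print(lhs, rhs)  # both values agree to quadrature accuracy
```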

Any hints will be highly appreciated!