Surface area of sphere using Dirac delta


This question is related to this one.

Suppose I want to calculate the surface area $S(R)$ of a sphere of radius $R$. I can express $S(R)$ as

$$S(R)=\int_{\mathbb{R}^3} \delta (\| \vec x \|-R) \ d \vec x$$

Switching to spherical coordinates, I would then obtain

$$S(R) = 4 \pi \int_0^\infty \delta(r-R) r^2 dr = 4 \pi R^2$$
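This first computation can be sanity-checked numerically by replacing the delta with a narrow Gaussian nascent delta $\delta_\epsilon(x)=e^{-x^2/\epsilon^2}/(\epsilon\sqrt\pi)$. A minimal sketch (the radius $R=2$ and width $\epsilon$ are arbitrary choices for illustration):

```python
import numpy as np

# Approximate S(R) = 4*pi * \int_0^inf delta(r - R) r^2 dr using a narrow
# Gaussian as a nascent delta; R and eps are arbitrary illustrative values.
R = 2.0
eps = 1e-3
r = np.linspace(0.0, 10.0, 2_000_001)
dr = r[1] - r[0]
delta = np.exp(-((r - R) / eps) ** 2) / (eps * np.sqrt(np.pi))
S = 4 * np.pi * np.sum(delta * r**2) * dr  # trapezoid-like Riemann sum
print(S, 4 * np.pi * R**2)  # the two values agree closely
```

The grid spacing is chosen much smaller than $\epsilon$ so the peak of the nascent delta is well resolved.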

which is the correct result.

But it seems to me that I could equivalently express $S(R)$ as

$$S(R)=\int_{\mathbb{R}^3} \delta (\| \vec x \|^2-R^2) \ d \vec x$$

which gives

$$S(R) = 4 \pi \int_0^\infty \delta(r^2-R^2) r^2 dr $$

From the property of composition of the delta with a function,

$$\delta(r^2-R^2)=\frac{\delta(r-R)+\delta(r+R)}{2R}$$

but since the integration is over $r \geq 0$, only the positive root contributes, so that

$$S(R) = 4 \pi \int_0^\infty \frac{\delta(r-R)}{2R} r^2 dr = 2 \pi R$$
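The same nascent-delta check can be run on the second expression, now approximating $\delta(r^2-R^2)$ directly (again with arbitrary illustrative values of $R$ and $\epsilon$); it reproduces $2\pi R$, confirming that the two integrals genuinely differ rather than this being an algebra slip:

```python
import numpy as np

# Approximate S(R) = 4*pi * \int_0^inf delta(r^2 - R^2) r^2 dr with a
# Gaussian nascent delta applied to the argument r^2 - R^2.
R = 2.0
eps = 1e-3
r = np.linspace(0.0, 10.0, 2_000_001)
dr = r[1] - r[0]
delta = np.exp(-((r**2 - R**2) / eps) ** 2) / (eps * np.sqrt(np.pi))
S = 4 * np.pi * np.sum(delta * r**2) * dr
print(S, 2 * np.pi * R)  # matches 2*pi*R, not 4*pi*R^2
```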

Why do I get two different results? Is something wrong with the second way of expressing $S(R)$?