Please define differentials rigorously, in a way that is consistent with how they are used in the following links.
I have read
- Is $\frac{\textrm{d}y}{\textrm{d}x}$ not a ratio?
- What is the practical difference between a differential and a derivative?
- Differential of a function at Wikipedia.
- If $\frac{dy}{dt}dt$ doesn't cancel, then what do you call it?
- Leibniz's notation at Wikipedia.
- Exact differential equation at Wikipedia.
- Moment of inertia at Wikipedia.
- Center of mass at Wikipedia, etc.
Differentials nowadays have a canonical definition, used every day in differential geometry and differential topology, as well as in mathematical physics. They are grounded in linear (resp., multilinear) algebra and in the notion of a $d$-dimensional real or complex manifold.
These differentials have nothing to do with the "infinitesimals" of nonstandard analysis, nor is the latter theory of any help in understanding and using them.
Not every $d$ you see in a formula signals that a differential is at work. In the sources you quote, the $d$ rather tries to convey the intuition of "a little bit of"; e.g., $d\,V$ means "a little bit of volume".
So when you see an expression like $$\int\nolimits_B (x^2+y^2)\ dV$$ this typographical picture encodes the result of a long thought process, and you should not think of $dV$ as a clear-cut mathematical entity. This thought process is the following: You are given a three-dimensional body $B$ (a "top") that is going to be rotated around the $z$-axis. Physical considerations tell you that the "rotational inertia" $\Theta$ of this body can be found by partitioning it into $N\gg1$ very small pieces $B_k$, choosing a point $(\xi_k,\eta_k,\zeta_k)$ in each $B_k$ and forming the sum $$R:=\sum_{k=1}^N(\xi_k^2+\eta_k^2){\rm vol}(B_k)\ .$$ The "true" $\Theta$ would then be the limit of such sums, when the diameters ${\rm diam}(B_k)$ go to zero.
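This limiting process can be carried out numerically. The following sketch (my own illustration, not from the sources above) approximates $\Theta$ for a solid cylinder of radius $R$ and height $h$ with unit density, where the limit is known in closed form to be $\frac{\pi}{2}R^4h$; the function name and the midpoint choice of $(\xi_k,\eta_k,\zeta_k)$ are assumptions of the sketch:

```python
import math

def rotational_inertia_cylinder(R=1.0, h=1.0, n=200):
    """Riemann-sum approximation of Theta = integral over B of (x^2+y^2) dV
    for the cylinder x^2 + y^2 <= R^2, 0 <= z <= h, with unit density."""
    dx = 2.0 * R / n                      # side length of the square cells
    total = 0.0
    for i in range(n):
        xi = -R + (i + 0.5) * dx          # cell midpoint, x-coordinate
        for j in range(n):
            eta = -R + (j + 0.5) * dx     # cell midpoint, y-coordinate
            if xi * xi + eta * eta <= R * R:
                # The integrand does not depend on z, so the sum over the
                # z-direction collapses to a factor h: vol(B_k) -> dx*dx*h.
                total += (xi * xi + eta * eta) * dx * dx * h
    return total
```

Refining the partition (larger `n`) drives the sum toward the limit $\frac{\pi}{2}R^4h$, which is $\approx1.5708$ for $R=h=1$.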
Similarly, when you have a plane curve $\gamma:\ s\mapsto {\bf z}(s)=\bigl(x(s),y(s)\bigr)$, its bending energy $J$ is given by the integral $$J:=\int\nolimits_\gamma \kappa^2(s)\ ds\ ,$$ where $\kappa$ denotes the curvature. Don't think here of the precise logical meaning of $ds$, but of the intended thought process: The curve is cut up into $N$ pieces of length $\Delta s_k>0$, and the curvature of $\gamma$ is measured at a point ${\bf z}(\sigma_k)$ of each piece. Then one forms the sum $$R:=\sum_{k=1}^N \kappa^2(\sigma_k)\>\Delta s_k\ ;$$ and finally the "true" $J$ is the limit of such sums when the $\Delta s_k$ go to zero.
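The same recipe can be run numerically for a concrete curve. The sketch below (my own illustration; the helper name and the midpoint sampling of the $\sigma_k$ are assumptions) forms the sum $\sum_k\kappa^2(\sigma_k)\,\Delta s_k$ for a parametrized curve, using the standard formulas $\kappa=(x'y''-y'x'')/(x'^2+y'^2)^{3/2}$ and $\Delta s_k\approx|{\bf z}'(\sigma_k)|\,\Delta t$:

```python
import math

def bending_energy(xdot, ydot, xddot, yddot, N=2000, t0=0.0, t1=2.0 * math.pi):
    """Riemann sum for J = integral of kappa^2 ds over a curve t -> (x(t), y(t)),
    given the first and second derivatives of the parametrization."""
    dt = (t1 - t0) / N
    J = 0.0
    for k in range(N):
        t = t0 + (k + 0.5) * dt           # midpoint sigma_k of the k-th piece
        xp, yp = xdot(t), ydot(t)
        xpp, ypp = xddot(t), yddot(t)
        speed = math.hypot(xp, yp)        # |z'(sigma_k)|
        kappa = (xp * ypp - yp * xpp) / speed ** 3
        J += kappa ** 2 * speed * dt      # kappa^2(sigma_k) * Delta s_k
    return J

# Sanity check on a circle of radius R: kappa = 1/R and the length is 2*pi*R,
# so J = 2*pi/R; for R = 2 the sum should approach pi.
R = 2.0
J_circle = bending_energy(lambda t: -R * math.sin(t), lambda t: R * math.cos(t),
                          lambda t: -R * math.cos(t), lambda t: -R * math.sin(t))
```

The circle check works because both $\kappa$ and the speed are constant there, so every term of the sum is the same and the limit is reached immediately.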
Now comes the question of "piece of area" vs. "piece of length". This question teaches us that we have to be careful when dealing with "little bits of something". Consider the following figure:
The "area under the curve" $\gamma$ corresponding to a certain $\Delta x>0$ is roughly $f(\xi)\cdot \Delta x$, independently of the exact slope of the curve at $\xi$. Making $\Delta x$ smaller will decrease the relative area error committed here. But the length $\Delta s$ of the short arc corresponding to $\Delta x$ is roughly ${\Delta x\over\cos\phi}$, and making $\Delta x$ smaller does not make the factor ${1\over\cos\phi}$ go away. It follows that the final formula for the total length has to incorporate the values ${1\over\cos\phi}=\sqrt{1+f'(\xi)^2}$.
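This is where the length formula $\int_a^b\sqrt{1+f'(x)^2}\,dx$ comes from, and the persistent slope factor is easy to exhibit numerically. A minimal sketch (my own; the function name and midpoint sampling are assumptions) that carries the weight $\sqrt{1+f'(\xi)^2}$ in each piece:

```python
import math

def graph_length(fprime, a, b, N=10000):
    """Riemann sum for the length of the graph of f over [a, b]: each piece
    Delta x is weighted by 1/cos(phi) = sqrt(1 + f'(xi)^2)."""
    dx = (b - a) / N
    return sum(math.sqrt(1.0 + fprime(a + (k + 0.5) * dx) ** 2) * dx
               for k in range(N))

# Example: f(x) = x^2 on [0, 1], where f'(x) = 2x and the exact length is
# sqrt(5)/2 + asinh(2)/4.
L = graph_length(lambda x: 2.0 * x, 0.0, 1.0)
```

Dropping the weight (i.e., summing bare $\Delta x$) always returns $b-a$, which is exactly the error the $\frac{1}{\cos\phi}$ discussion warns against.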