The density of states of a system in an interval $[E, E+dE]$ is given implicitly by $dV = D(E)\,dE$ (or, I suppose, explicitly by $D(E) = \frac{dV}{dE}$, but we'll be integrating it anyway, so it doesn't really matter).
Is there some way of stating this without using the infinitesimal in the interval? I hate it when physicists throw infinitesimals around, and I can usually find a more mathematical statement, but what could I do here?
These infinitesimals (and many others that appear in physics) can be interpreted as finite differences, $\Delta E$ for instance. The density obtained from $\Delta V = D(E)\,\Delta E$ is then an average density over the interval. To obtain the density you want, just take the limit $\Delta E \to 0$ of this equation, as spelled out below.
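Concretely, writing $V(E)$ for the cumulative quantity whose differential appears in the question (an assumption about the notation there), the limit statement is just the ordinary derivative:

$$D(E) = \lim_{\Delta E \to 0} \frac{V(E+\Delta E) - V(E)}{\Delta E} = \frac{dV}{dE},$$

with no infinitesimal interval needed anywhere.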
Note that infinitesimals and finite differences are not the same thing. But from a scientific point of view, the error you get when using one instead of the other is acceptable, since every experimental measurement has an intrinsic error associated with it.
On the other hand, infinitesimals don't have a rigorous definition in standard analysis, although they can be put on a rigorous footing using non-standard analysis; I know nothing about that, though. If you set aside for a moment the concept of infinitesimal you learnt, differentials (not infinitesimals) are just linear functionals, or more generally, 1-forms. Therefore, I suggest you either go further and study non-standard analysis or just forget about the term "infinitely small".
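To sketch what the 1-form reading means here (again assuming $V(E)$ is the cumulative quantity from the question): $dV$ and $dE$ are both 1-forms on the energy axis, and $dV = D(E)\,dE$ is an equality of 1-forms, equivalent to $D(E) = V'(E)$ pointwise. Integrating it over a finite interval then reads

$$\int_{E_1}^{E_2} dV = \int_{E_1}^{E_2} D(E)\,dE = V(E_2) - V(E_1),$$

which is exactly the "we'll be integrating it anyway" use in the question, stated without any infinitely small quantities.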