Assume $f(r)$ is a non-increasing function of $r$ and consider the limit $$\lim_{r \rightarrow 0^+} \frac{\log{f(r)}}{\log{\frac{1}{r}}}$$ Something I'm reading states:
Since $f(r)$ is a non-increasing function of $r$ and since $\log{\frac{1}{r}}$ changes very slowly, instead of the limit along all values of $r$ we can take the limit along any sequence $r_j$ such that $$\frac{\log{\frac{1}{r_{j+1}}}}{\log{\frac{1}{r_j}}} \rightarrow 1~~~~~\text{as}~~~~~ j \rightarrow \infty$$
I don't quite understand why any of this is true. I assume there's some hand-waving going on here, and that it's happening because I lack prerequisites.
When I read "... since $\log{\frac{1}{r}}$ changes very slowly ..." I think its derivative must be relatively "small." But its derivative is $-\frac{1}{r}$, whose magnitude blows up as $r \rightarrow 0^+$, so I must be mistaken here. What does it mean for $\log{\frac{1}{r}}$ to "change very slowly"?
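My best guess at an interpretation (this is mine, not the text's): "slowly" is meant multiplicatively rather than pointwise. Shrinking $r$ by a constant factor (say, halving it) only changes $\log{\frac{1}{r}}$ by the additive constant $\log 2$, so the ratio of the new value to the old one tends to $1$ as $r \rightarrow 0^+$. A quick numerical sanity check:

```python
import math

# "Slow" in the multiplicative sense: halving r only *adds* log 2 to
# log(1/r), so the ratio of successive values tends to 1 as r -> 0+,
# even though the derivative -1/r blows up.
def rel_change(r):
    return math.log(1.0 / (r / 2)) / math.log(1.0 / r)

for r in [1e-2, 1e-6, 1e-12]:
    print(r, rel_change(r))  # ratios drift down toward 1
```

Is that the sense of "slowly varying" intended here?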
Why can we use such a sequence in the limit? Is there something specific about that sequence (that may be relevant later in the text, or that is covered in some review of a common trick), or is there some general principle here that I'm missing? I have an idea as to when one can use a sequence to evaluate a limit, but the condition on the sequence $r_j$ doesn't fit that intuition. I could understand something more naive, like requiring $r_j \rightarrow 0$ as $j \rightarrow \infty$, but I don't think that's equivalent to the condition given: when $r_j \rightarrow 0$, both $\log{\frac{1}{r_{j+1}}}$ and $\log{\frac{1}{r_j}}$ tend to $\infty$, and I don't see why their ratio should tend to $1$ in general.
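For what it's worth, reading the condition as comparing consecutive terms $r_{j+1}$ and $r_j$ (which is how I understand it), I tried checking numerically which sequences satisfy it. Both example sequences below are mine, not from the text:

```python
import math

# The condition: log(1/r_{j+1}) / log(1/r_j) -> 1 as j -> infinity.
def cond_ratio(r_j, r_next):
    return math.log(1.0 / r_next) / math.log(1.0 / r_j)

# Geometric sequence r_j = 2**-j: the ratio is (j+1)/j -> 1, so it qualifies.
for j in [1, 10, 100]:
    print(j, cond_ratio(2.0 ** -j, 2.0 ** -(j + 1)))

# Much faster sequence r_j = 2**-(2**j): the ratio is 2**(j+1)/2**j = 2,
# so it fails the condition even though r_j -> 0.
print(cond_ratio(2.0 ** -(2 ** 5), 2.0 ** -(2 ** 6)))  # ratio is 2
```

So the condition is strictly stronger than $r_j \rightarrow 0$: it rules out sequences that shrink "too fast," which seems to matter here, but I don't see why.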
If this is a matter of me lacking some prerequisite and not just being dense, I'd also appreciate any textbook or lecture recommendations. (For context, this is regarding the Minkowski dimension; I've replaced the number of balls necessary to cover some set with $f(r)$.)
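(To keep myself honest, here's the toy case I've been using to build intuition, with a made-up exponent $d$: if $f(r) = r^{-d}$, then $\frac{\log{f(r)}}{\log{\frac{1}{r}}} = d$ exactly for every $r \in (0, 1)$, so in that case the limit trivially exists and equals $d$.)

```python
import math

d = 1.5  # hypothetical exponent; f(r) = r**(-d) is my toy example, not from the text

def ratio(r):
    """log f(r) / log(1/r) for f(r) = r**(-d)."""
    return math.log(r ** (-d)) / math.log(1.0 / r)

for r in [0.1, 0.01, 0.001]:
    print(r, ratio(r))  # each ratio equals d up to floating-point error
```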
(Also, feel free to edit the title to something more descriptive)