Currently I'm reading a paper in probability theory. The author often uses the little-oh symbol, which I am not really familiar with, for example $$ o_\varepsilon(n) \text{ for some } \varepsilon > 0 \text{ and } n\in\mathbb N. $$ My intuition tells me that $f_\varepsilon(n)\in o_\varepsilon(n)$ means $$ \lim_{\varepsilon\to0} \frac{|f_\varepsilon(n)|}{n} = 0 $$ or something related. Could you tell me what you believe this means? I am especially uncertain about the index $\varepsilon$. For example, with my "definition", taking $f_\varepsilon(n) = \varepsilon$ and $g(n)=n$ gives $$ \frac{|f_\varepsilon(n)|}{g(n)} = \frac{\varepsilon}{n} \to 0 $$ as $\varepsilon \to 0$ (or should it be $n\to\infty$)?
Here is the link to the paper.
Typically this means that the unsubscripted version holds for any fixed value(s) of the subscripted variable(s). See, for example, p. 13 of these notes (this is for big-Oh, but the same concept applies).
Here $f_\varepsilon(n)=o_\varepsilon(n)$ would mean that if $\varepsilon>0$ is kept fixed and $n\to \infty$, then $\frac{f_\varepsilon(n)}{n}\to 0$. For example, $f_\varepsilon(n)=\varepsilon^{-1}\sqrt n$ satisfies this definition for every fixed $\varepsilon$, but $f_\varepsilon(n)=o(n)$ is not necessarily true if $\varepsilon$ is allowed to depend on $n$: with $\varepsilon = 1/n$ you get $f_{1/n}(n) = n\sqrt n$, which grows faster than $n$.
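To see this numerically, here is a small sketch (the function `f` below is just the example $f_\varepsilon(n)=\varepsilon^{-1}\sqrt n$ from above, not anything from the paper): with $\varepsilon$ held fixed the ratio $f_\varepsilon(n)/n$ shrinks toward $0$, but if $\varepsilon$ is coupled to $n$ the ratio can blow up.

```python
import math

def f(eps, n):
    # Example function from the answer: f_eps(n) = (1/eps) * sqrt(n)
    return math.sqrt(n) / eps

# With eps fixed, f(eps, n)/n = 1/(eps*sqrt(n)) -> 0 as n -> infinity,
# so f_eps(n) = o_eps(n).
eps = 0.01
ratios_fixed = [f(eps, n) / n for n in (10**2, 10**4, 10**6)]
print(ratios_fixed)  # [10.0, 1.0, 0.1] -- decreasing toward 0

# If eps is allowed to depend on n (here eps = 1/n), the ratio is
# f(1/n, n)/n = sqrt(n), which diverges, so f is NOT o(n) uniformly:
ratios_coupled = [f(1 / n, n) / n for n in (10**2, 10**4, 10**6)]
print(ratios_coupled)  # [10.0, 100.0, 1000.0] -- sqrt(n), diverging
```

This illustrates why the subscript matters: the limit is taken in $n$ with $\varepsilon$ frozen, and the statement can fail once $\varepsilon$ varies with $n$.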