I'm having serious trouble understanding what people mean when they write $o(\epsilon_\text{mach})$, where $\epsilon_\text{mach}$ stands for the machine epsilon. I see this in backward error analysis for some finite-precision algorithms.
As I understand it, the little-oh means that $f(x)\in o(g(x)) \iff \lim_{|x|\to0}\frac{|f(x)|}{|g(x)|} = 0$. The problem is that $\epsilon_\text{mach}$ is a constant, so $f(x)\in o(\epsilon_\text{mach})$ would mean $\lim_{|x|\to0}\frac{|f(x)|}{\epsilon_\text{mach}} = 0$, and therefore $\lim_{|x|\to0}|f(x)| = 0$. From this we get $f(x)\in o(1)$. So why bother mentioning $\epsilon_\text{mach}$ in the first place?
Of course, I suspect my conclusion is wrong because I've probably misunderstood something. I'd appreciate your help in understanding this. Thank you.
The machine epsilon $\epsilon_\text{mach}$ is not meant to be treated as a constant, but as a variable. The idea is to measure how fast $f(\epsilon_\text{mach})$ goes to zero as the precision goes to infinity (which means $\epsilon_\text{mach} \to 0$).
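For instance, here is a quick numerical illustration (a minimal sketch using NumPy's standard IEEE types; any sequence of increasing precisions would do):

```python
import numpy as np

# Machine epsilon for IEEE half, single, and double precision:
# as the significand gains bits (i.e., the precision grows),
# eps shrinks toward 0.
for dtype in (np.float16, np.float32, np.float64):
    print(f"{dtype.__name__}: eps = {np.finfo(dtype).eps}")

# float16: eps = 0.000977
# float32: eps = 1.1920929e-07
# float64: eps = 2.220446049250313e-16
```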
Therefore, $$f\in o(\epsilon_\text{mach}) \iff \lim_{\epsilon_\text{mach}\to 0}\frac{|f(\epsilon_\text{mach})|}{\epsilon_\text{mach}} = 0.$$
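As a concrete (made-up) example: if an algorithm's error bound behaves like $f(\epsilon_\text{mach}) = 10\,\epsilon_\text{mach}^2$, then $$\lim_{\epsilon_\text{mach}\to 0}\frac{10\,\epsilon_\text{mach}^2}{\epsilon_\text{mach}} = \lim_{\epsilon_\text{mach}\to 0}10\,\epsilon_\text{mach} = 0,$$ so $f\in o(\epsilon_\text{mach})$. By contrast, $f(\epsilon_\text{mach}) = 10\,\epsilon_\text{mach}$ is $O(\epsilon_\text{mach})$ but not $o(\epsilon_\text{mach})$, since the ratio tends to $10 \neq 0$. So the statement carries more information than plain $o(1)$: it says the error vanishes *faster* than the machine epsilon itself as the precision increases.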