Let the problem we're studying be $f : X \to Y$.
Say I don't yet know what I want to define time complexity with respect to; I just know I have a map $|\cdot| : X \to \Bbb{R}$ with $|x| \geq 0$ for all $x \in X$. This is usually the "size" of the input.
And we have a running-time function for an algorithm computing $f$ on some machine, $T : X \to \Bbb{R}$. Then it's not immediately clear to me how to do Big-O analysis on $T$. The usual definition, $g = O(h) \iff \exists (x_0, K) : \forall x \geq x_0,\ g(x) \leq K h(x)$, i.e. $g$ is eventually dominated by a constant times $h$, assumes $g, h : \Bbb{R} \to \Bbb{R}$. What's the best way to express that when the domain is an arbitrary set $X$?
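For a concrete instance of that definition on $\Bbb{R} \to \Bbb{R}$ functions (a throwaway sketch; the function names and the witness pair are mine): $n^2 + 5n = O(n^2)$ is witnessed by $(x_0, K) = (5, 2)$, since $n^2 + 5n \leq 2n^2 \iff n \geq 5$.

```python
# Sketch: spot-check eventual domination f(n) <= K*g(n) for n >= x0,
# using the concrete witness pair (x0, K) = (5, 2).

def f(n):
    return n**2 + 5*n

def g(n):
    return n**2

x0, K = 5, 2

# Check the inequality on a range of points past x0.
assert all(f(n) <= K * g(n) for n in range(x0, 10_000))
```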
Thanks.
I think we should define an input-measuring function $|\cdot|$ and express the running time as a function of the size, i.e. analyze $x \mapsto T(|x|)$ where now $T : \Bbb{R} \to \Bbb{R}$, so the usual Big-O definition applies.
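One wrinkle with that proposal: many inputs can share a size, so a per-input step count doesn't directly give a function of size alone. A common convention (not the only one) is the worst case over each size, $T(n) = \max\{\text{steps}(x) : |x| = n\}$. A minimal sketch, with linear search as the example problem and names of my own choosing:

```python
# Sketch (names are mine): turn a per-input step count into a
# function of size alone via the worst case over each size,
#   T(n) = max { steps(x) : |x| = n }.
# Example problem: membership search in a list; |x| = len(list).

from itertools import product

def steps(xs, target):
    """Count comparisons made by linear search for target in xs."""
    count = 0
    for v in xs:
        count += 1
        if v == target:
            break
    return count

def T(n):
    # Worst case over all 0/1 lists of length n, searching for 1.
    return max(steps(list(xs), 1) for xs in product([0, 1], repeat=n))
```

Linear search makes at most $n$ comparisons (achieved when the target is absent), so here $T(n) = n$, and $T = O(n)$ in the ordinary real-valued sense.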