In the complexity analysis of algorithms, we often restrict attention to asymptotic complexity. A function $g: \mathbb N \to \mathbb R_+$ is asymptotically bounded by $f: \mathbb N \to \mathbb R_+$ if there are constants $c \in \mathbb R_+, N \in \mathbb N$ such that for each $n \geq N$, $$ g(n) \leq c\cdot f(n). $$ The set of all such $g$ is denoted $\mathcal O(f)$. By letting $g \preceq f$ whenever $\mathcal O(g) \subseteq \mathcal O(f)$, we obtain a preorder on the set of functions $\mathbb N \to \mathbb R_+$.
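To make the definition concrete, take $g(n) = 2n + 3$ and $f(n) = n$: with $c = 5$ and $N = 1$ we have $$ g(n) = 2n + 3 \leq 2n + 3n = 5n = c \cdot f(n) \quad \text{for all } n \geq 1, $$ so $g \in \mathcal O(f)$. Since also $f(n) = n \leq 2n + 3 = g(n)$, in fact $\mathcal O(g) = \mathcal O(f)$, so both $g \preceq f$ and $f \preceq g$ hold: the preorder identifies functions that differ only by constant factors and lower-order terms.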
Examples of functions that typically arise as complexities of algorithms are: $$\tag{*} \log(\log(n)), \log(n),\log(n)^2, \sqrt{n}, n, n\log(n),n\sqrt{n},n^3, 2^n, 3^n, 2^{2^n}. $$ Interestingly, these functions are all linearly ordered by $\preceq$. It is not hard to find functions that are incomparable under $\preceq$, like $|\sin(n)|$ and $|\cos(n)|$, and with some fiddling one could construct strictly increasing (or even convex) functions that are incomparable as well. However, such examples all seem artificial, and in particular I would be surprised if they ever came up as complexities of algorithms.
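A sufficient criterion for the strict ordering $g \prec f$ is $\lim_{n\to\infty} g(n)/f(n) = 0$. As a sanity check (not a proof), one can verify this limit for each consecutive pair in (*) with a computer algebra system; here is a sketch using SymPy, assuming it is installed:

```python
# Check that each consecutive pair (g, f) in the chain (*) satisfies
# lim_{n -> oo} g(n)/f(n) = 0, which implies g is strictly below f
# in the asymptotic ordering. Uses SymPy's symbolic limits.
from sympy import symbols, log, sqrt, limit, oo

n = symbols('n', positive=True)
chain = [
    log(log(n)), log(n), log(n)**2, sqrt(n), n, n*log(n),
    n*sqrt(n), n**3, 2**n, 3**n, 2**(2**n),
]

for g, f in zip(chain, chain[1:]):
    assert limit(g / f, n, oo) == 0, (g, f)
print("all consecutive ratios tend to 0")
```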
Can we explicitly give a nice class of functions that includes at least all the functions listed in (*), and for which $\preceq$ is a linear ordering? Is this class also closed under some operations, like multiplication, exponentiation by a constant, or taking logarithms? And, perhaps, could we even justify why any algorithm is likely to have a complexity in this class?
The set of functions built from constants and the variable $n$ using field operations, $\exp$, and $\log$ is linearly ordered by $\preceq$. These are called exp-log functions (Hardy's "logarithmico-exponential functions"); take a look at Hardy's "Orders of Infinity", page 24. By construction, this class is closed under multiplication, exponentiation by a constant, and taking logarithms (wherever the results are defined and eventually positive). For more about this, search for "Hardy fields".
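As an illustration of how comparisons in this class reduce to simpler ones, consider $n^{\log n}$ versus $2^{\sqrt n}$ (both exp-log functions). Taking logarithms, $$ \log\bigl(n^{\log n}\bigr) = (\log n)^2 \quad\text{and}\quad \log\bigl(2^{\sqrt n}\bigr) = \sqrt{n}\,\log 2, $$ and since $(\log n)^2 = o(\sqrt n)$, the difference of logarithms tends to $-\infty$, so $n^{\log n} \prec 2^{\sqrt n}$. In a Hardy field any two elements are comparable in this way, which is what makes the ordering linear.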
This class covers most "typical" complexities. However, the world of complexity is much richer than any simple description that can be written down. Complexity can be: