Using the formal definition of big O notation, would f(x) not be O(g(x)) for nearly all functions of g(x)?


The formal definition of big O notation states that

$$f(x)=O(g(x)) \text{ as } x \rightarrow \infty \iff \text{there exist } M>0 \text{ and } x_0 \text{ such that } |f(x)| \leq M|g(x)| \text{ for all } x \geq x_0.$$

But would this not mean, for example, when $f(x) = 2x^2$, $f(x) = O(x!)$?

Setting $M=1$ and $x_0 = 5$ shows that $2x^2 \leq x!$ for all $x \geq 5$.
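As a quick numerical sanity check of that claim (a sketch; the upper bound of 20 is arbitrary, chosen just to test a range of values):

```python
from math import factorial

# Verify the claimed witness M = 1, x_0 = 5 for 2x^2 = O(x!):
# check that 2*x**2 <= 1 * x! holds for a range of x >= 5.
x0 = 5
for x in range(x0, 20):
    assert 2 * x**2 <= factorial(x), f"inequality fails at x={x}"
print("2*x^2 <= x! holds for all tested x >= 5")
```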

The problem is that this doesn't sound right, and it also doesn't follow the rules suggested here for finding the big O of a given function.

My question is whether this is correct, which would mean that every function belongs to infinitely many big-O classes, or whether I'm doing something wrong.

BEST ANSWER

Your understanding is correct. For instance, if $f \in O(x^n)$ then $f \in O(x^{n+m})$ for any $m \geq 0$ as well.
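To make that containment explicit, here is the one-line argument from the definition (a sketch; the constants $M$ and $x_0$ are those witnessing $f \in O(x^n)$):

```latex
% If |f(x)| <= M x^n for all x >= x_0, then for x >= max(x_0, 1)
% we also have x^n <= x^{n+m}, so the same constant M works:
|f(x)| \leq M x^n \leq M x^{n+m} \quad \text{for all } x \geq \max(x_0, 1).
```

So every witness pair $(M, x_0)$ for $O(x^n)$ is automatically a witness for $O(x^{n+m})$.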

Usually, when asked to show $f \in O(g)$ for some $g$, you want $g$ to grow as slowly as possible. Saying $f \in O(x!)$ is not very helpful because that class is enormous.

There is no "unique" or "best" big-O class for any given $f$, but one usually tries to find the best among $O(g)$ where $g$ is among monomials, logarithms, exponentials, etc.