In the Introduction to Algorithms textbook by Cormen, Leiserson, Rivest, and Stein, it says that "When $a>0$, any linear function $an+b$ is in $O(n^2)$, which can be easily verified by taking $c = a + |b|$ and $n_0 = \max(1, -b/a)$." I understand that any linear function is in $O(n^2)$, but I don't understand how they come up with the $c$ and $n_0$.
In this CLRS textbook, the definition of big-Oh notation is:
$O(g(n)) = \{f(n): \text{there exist positive constants } c \text{ and } n_0 \text{ such that } 0\le f(n)\le cg(n) \text{ for all } n\ge n_0\}$
Question: How can I derive the expressions for $c$ and $n_0$?
Thank you!
You can think of $O(n^2)$ as a set of functions. It is the set of functions $f$ for which there is a constant $c$ such that $f(n)\le cn^2$ for every $n$ large enough. In other words, $O(n^2)$ is the set of functions that asymptotically grow at most as fast as a quadratic.
Similarly, $O(n)$ is the set of functions $f$ for which there is a constant $c$ such that $f(n)\le cn$ for every $n$ large enough. Since $cn \le cn^2$ whenever $n\ge 1$, any function that is $O(n)$ is also $O(n^2)$.
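Applying this to the linear function in the question, one way to see where the CLRS constants come from is to verify the defining inequality directly. Assuming $a > 0$, take $n_0 = \max(1, -b/a)$: the condition $n \ge -b/a$ guarantees $an + b \ge 0$, and the condition $n \ge 1$ guarantees $n \le n^2$ and $1 \le n^2$. Then for every $n \ge n_0$:

```latex
% Verifying the CLRS witnesses c = a + |b| and n_0 = max(1, -b/a), for a > 0.
\[
0 \;\le\; an + b            % nonnegative, since n \ge -b/a
  \;\le\; an + |b|          % because b \le |b|
  \;\le\; an^2 + |b|\,n^2   % because n \le n^2 and 1 \le n^2 when n \ge 1
  \;=\; (a + |b|)\,n^2 ,
\]
```

so the definition is satisfied with $c = a + |b|$.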
In fact, if $f$ is $O(n^k)$, then $f$ is also $O(n^\ell)$ for any $\ell \geq k$.
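As a quick sanity check, the inequality $0 \le an+b \le (a+|b|)n^2$ for $n \ge \max(1, -b/a)$ can also be tested numerically. The sketch below (with illustrative coefficients chosen here, not taken from the text) checks it over a range of integers:

```python
import math

def clrs_constants(a, b):
    """Return the CLRS witnesses c = a + |b| and n0 = max(1, -b/a), for a > 0."""
    assert a > 0
    return a + abs(b), max(1.0, -b / a)

def holds(a, b, n_max=1000):
    """Check 0 <= a*n + b <= c*n**2 for every integer n with n0 <= n <= n_max."""
    c, n0 = clrs_constants(a, b)
    start = math.ceil(n0)  # first integer at or above n0
    return all(0 <= a * n + b <= c * n * n for n in range(start, n_max + 1))

# Illustrative coefficients: one with b > 0, one with b < 0.
print(holds(3, 5))    # -> True
print(holds(2, -7))   # -> True; here n0 = max(1, 7/2) = 3.5, so checks start at n = 4
```

Of course, a finite check is not a proof, but it is a handy way to convince yourself that the constants in the book do what they claim.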