This is an excerpt from a textbook I am reading:
A number of useful shortcuts can be applied when using asymptotic notation. First:
$O(n^{c_1}) \subset O(n^{c_2})$
for any $c_1 < c_2.$ Second: For any constants $a,b,c>0,$
$O(a) \subset O(\log n) \subset O(n^b) \subset O(c^n)$
These inclusion relations can be multiplied by any positive value, and they still hold. For example, multiplying by n yields:
$O(n) \subset O(n \log n) \subset O(n^{1+b}) \subset O(n c^n)$
I'm wondering how $O(\log n)$ can be a subset of $O(n^b)$.
Also, how can functions be subsets of other functions?
First, note that $O(g)$ is not a single function but the *set* of all functions that grow no faster than $g$ (up to a constant factor), which is why inclusions like $O(\log{n}) \subset O(n^b)$ make sense. Now suppose that $f \in O(\log{n})$. Then there is a constant $C > 0$ for which $f(n) \le C \log{n}$ for all sufficiently large $n$.
On the other hand, for every $\alpha > 0$ we have $\log{n} < n^\alpha$ for large enough $n$. This can be seen readily enough by comparing the graphs of $\log{n}$ and $n^{\alpha}$, or by observing that $e^x$ eventually dominates every polynomial $x^k$: taking $x = \alpha \log{n}$, the inequality $e^x > x/\alpha$ (valid for all large $x$) becomes exactly $n^{\alpha} > \log{n}$.
Hence, for large $n$, we've got $f(n) \le C \log{n} < C n^{\alpha}$ so that $f \in O(n^{\alpha})$ as desired.
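As a quick numerical sanity check (my own addition, not part of the original answer), here is a small Python snippet comparing $\log n$ with $n^\alpha$ for the illustrative choice $\alpha = 0.1$:

```python
import math

# Compare log(n) with n**alpha for a small alpha.
# The inequality log(n) < n**alpha holds "for sufficiently large n",
# but the crossover point can be surprisingly late.
alpha = 0.1
for n in (1e4, 1e10, 1e16, 1e20):
    log_n = math.log(n)
    n_pow = n ** alpha
    print(f"n = {n:.0e}: log n = {log_n:6.2f}, "
          f"n^{alpha} = {n_pow:7.2f}, log n < n^{alpha}: {log_n < n_pow}")
```

With $\alpha = 0.1$ the inequality fails at $n = 10^{10}$ ($\log n \approx 23$ versus $n^{0.1} = 10$) and only starts holding somewhere around $n \approx 10^{16}$, which illustrates why the hedge "for all sufficiently large $n$" in the argument above really matters.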