First of all, let me state the convention that $\lg$ denotes the logarithm with base $2$.
Now, I'm studying from the book Introduction to Algorithms by Cormen et al. (3rd edition). On page 96, it is written that:
$\lg(n)$ is asymptotically less than $n^\varepsilon$ for any positive constant $\varepsilon$.
But I have difficulty understanding this. First of all, I want to be sure about the meaning of "asymptotically less". I take it to mean that, for all sufficiently large $n$, one of the functions always takes smaller values than the other. Is that right?

If so, how can I see that $\lg(n)$ is asymptotically less than $n^\varepsilon$?
Thanks in advance.
In Asymptotic Analysis, $f\sim g$ means $\lim\limits_{n\to\infty}\frac{f(n)}{g(n)}=1$, and $f\lesssim g$ means $\limsup\limits_{n\to\infty}\frac{f(n)}{g(n)}\le1$. Thus, asymptotically less than would mean $$ \limsup_{n\to\infty}\frac{f(n)}{g(n)}\lt1\tag{1} $$ However, in Algorithmic Analysis, if $$ 0\lt c\le\liminf\limits_{n\to\infty}\frac{f(n)}{g(n)}\le\limsup\limits_{n\to\infty}\frac{f(n)}{g(n)}\le C\lt\infty $$ for some constants $c$ and $C$, $f$ and $g$ are said to be of the same order.
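To make the distinction between these notions concrete (my own example, not from the book): take $f(n)=2n$ and $g(n)=n$. Then $$ \lim_{n\to\infty}\frac{f(n)}{g(n)}=2 $$ so $f$ and $g$ are of the same order in the algorithmic sense (take $c=C=2$), yet $f\not\sim g$ and $f$ is not asymptotically less than $g$, since the limit is neither $1$ nor less than $1$.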
So to be more precise, I believe you want it to be shown that $\log_2(n)=o\left(n^\varepsilon\right)$, which means $$ \lim_{n\to\infty}\frac{\log_2(n)}{n^\varepsilon}=0\tag{2} $$ Since $(2)$ implies $(1)$, we will show that $(2)$ holds.
Since $1+x\le e^x$ for all $x$, taking logarithms gives $1+\log(x)\le x$ for $x\gt0$, hence $\log(x)\lt x$. Substituting $x=n^{\varepsilon/2}$ yields $$ \tfrac\varepsilon2\log(n)=\log\left(n^{\varepsilon/2}\right)\lt n^{\varepsilon/2} $$ Dividing both sides by $\tfrac\varepsilon2\log(2)\,n^\varepsilon$ gives $$ \frac{\log_2(n)}{n^\varepsilon}\lt\frac2{\varepsilon\log(2)}n^{-\varepsilon/2} $$ Since the right side vanishes as $n\to\infty$, we get that $\log_2(n)=o\left(n^\varepsilon\right)$ for any $\varepsilon\gt0$.
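This is not part of the proof, but if it helps build intuition, here is a small numerical sketch (my own, using an arbitrarily chosen $\varepsilon=0.1$) showing the ratio $\log_2(n)/n^\varepsilon$ shrinking toward $0$ as $n$ grows:

```python
import math

# Illustration only: for a small fixed eps, the ratio
# log2(n) / n^eps still tends to 0 as n grows, matching
# log2(n) = o(n^eps). Note the ratio can exceed 1 for
# moderate n before eventually decaying.
eps = 0.1
for k in range(1, 7):
    n = 10 ** (10 * k)  # n = 1e10, 1e20, ..., 1e60
    ratio = math.log2(n) / n ** eps
    print(f"n = 1e{10 * k}: log2(n)/n^eps = {ratio:.6f}")
```

For $\varepsilon=0.1$ the ratio is about $3.32$ at $n=10^{10}$ but already below $0.001$ by $n=10^{60}$, so the convergence is real even though it is slow.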