Which one is asymptotically less: $n^\varepsilon$ or $\lg(n)$?


First of all, let me state the convention: $\lg$ is the logarithm with base $2$.

Now, I'm studying from Cormen's Introduction to Algorithms (3rd edition). In this book, on page 96, it is written that:

$\lg(n)$ is asymptotically less than $n^\varepsilon$ for any positive constant $\varepsilon$.

But I have difficulty understanding this. First of all, I want to be sure about the meaning of "asymptotically less". I think it means that, for large values of $n$, one of them always takes smaller values than the other, right?

If I'm right, how can I see that $\lg(n)$ is asymptotically less than $n^\varepsilon$?

Thanks in advance.


BEST ANSWER

In Asymptotic Analysis, $f\sim g$ means $\lim\limits_{n\to\infty}\frac{f(n)}{g(n)}=1$, and $f\lesssim g$ means $\limsup\limits_{n\to\infty}\frac{f(n)}{g(n)}\le1$. Thus, "asymptotically less than" would mean $$ \limsup_{n\to\infty}\frac{f(n)}{g(n)}\lt1\tag{1} $$

However, in Algorithmic Analysis, if $$ 0\lt c\le\liminf\limits_{n\to\infty}\frac{f(n)}{g(n)}\le\limsup\limits_{n\to\infty}\frac{f(n)}{g(n)}\le C\lt\infty $$ for some constants $c$ and $C$, then $f$ and $g$ are said to be of the same order.

So to be more precise, I believe you want it to be shown that $\log_2(n)=o\left(n^\varepsilon\right)$, which means $$ \lim_{n\to\infty}\frac{\log_2(n)}{n^\varepsilon}=0\tag{2} $$ Since $(2)$ implies $(1)$, we will show that $(2)$ holds.


Since $1+x\le e^x$ for all $x$, replacing $x$ by $\log(x)$ gives $1+\log(x)\le e^{\log(x)}=x$ for $x\gt0$. Applying this with $x=n^{\varepsilon/2}$ yields $1+\tfrac\varepsilon2\log(n)\le n^{\varepsilon/2}$, and therefore $$ \tfrac\varepsilon2\log(n)\lt n^{\varepsilon/2} $$ thus, $$ \frac{\log_2(n)}{n^\varepsilon}\lt\frac2{\varepsilon\log(2)}n^{-\varepsilon/2} $$ Since the right side vanishes as $n\to\infty$, we get that $\log_2(n)=o\left(n^\varepsilon\right)$ for any $\varepsilon\gt0$.
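The vanishing of this ratio is also easy to check numerically. Here is a small Python sketch (my own illustration, not part of the proof) evaluating $\log_2(n)/n^\varepsilon$ for $\varepsilon=0.5$ at growing $n$:

```python
import math

def ratio(n, eps):
    """Compute log2(n) / n**eps, the quantity shown above to vanish."""
    return math.log2(n) / n**eps

# For eps = 0.5 the ratio shrinks toward 0 as n grows:
for n in (10**2, 10**4, 10**8):
    print(n, ratio(n, 0.5))
```

One caveat: for very small $\varepsilon$ the ratio first *rises* (its maximum is at $n=e^{1/\varepsilon}$) before decaying, so "for large $n$" can mean quite large indeed.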


As usual: study the difference (via its derivative).

To compare which of $f(t)$ and $g(t)$ is greater, you just have to study the sign of $h = f-g$. Now if $h(t)$ is monotonic (i.e., always increasing or always decreasing) at least after some $t_0$, then you can predict its sign at infinity.
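As a concrete instance of this approach (my own worked example, with the choice $h(t)=t^\varepsilon-\lg(t)$ for a fixed $\varepsilon\gt0$):

```latex
h(t) = t^{\varepsilon} - \lg(t), \qquad
h'(t) = \varepsilon t^{\varepsilon-1} - \frac{1}{t\ln 2}
      = \frac{1}{t}\left(\varepsilon t^{\varepsilon} - \frac{1}{\ln 2}\right)
      > 0 \quad\text{once } \varepsilon t^{\varepsilon} > \frac{1}{\ln 2}.
```

So $h$ is increasing for all $t$ beyond $t_0=(\varepsilon\ln 2)^{-1/\varepsilon}$; moreover $h'(t)\ge\frac{1}{t\ln 2}$ once $\varepsilon t^\varepsilon\ge\frac{2}{\ln 2}$, and since $\int^\infty\frac{dt}{t\ln 2}$ diverges, $h(t)\to+\infty$: eventually $t^\varepsilon\gt\lg(t)$.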


This means that for any given positive constants $\epsilon$ and $c$, you can always find a constant $n_0 > 0$ such that:

$\lg(n) < c\, n^\epsilon$ for all $n\geq n_0$.
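For a fixed $c$ and $\epsilon$, such an $n_0$ can be found by a simple doubling search. A minimal Python sketch (the helper name `find_n0` is my own, not from the answer):

```python
import math

def find_n0(c, eps, start=2):
    """Return the first power of two n0 >= start with lg(n0) < c * n0**eps.

    Since lg(n)/n**eps tends to 0, such an n0 always exists; the ratio has a
    single maximum at n = e**(1/eps) and decreases beyond it, so once the
    inequality holds past that point it keeps holding.  We therefore first
    skip ahead past the peak, then double until the inequality holds.
    """
    n = max(start, 2)
    peak = math.e ** (1.0 / eps)   # lg(n)/n**eps is decreasing beyond this
    while n < peak or math.log2(n) >= c * n**eps:
        n *= 2
    return n

n0 = find_n0(c=0.01, eps=0.5)
# Every sampled n >= n0 satisfies the inequality:
assert all(math.log2(n) < 0.01 * n**0.5 for n in (n0, 4 * n0, n0**2))
```

Restricting to powers of two keeps the search fast; any larger $n_0$ works just as well, since the ratio is already decreasing there.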