Which is correct: $n^{\mathcal{O}(1)} = \mathcal{O}(n^{\log n})$ or $n^{\mathcal{O}(1)} = o(n^{\log n})$?


I cannot work out which is correct: $$n^{\mathcal{O}(1)} = \mathcal{O}(n^{\log n})$$ or $$n^{\mathcal{O}(1)} = o(n^{\log n})$$

I know that little-o notation means

For every choice of a constant $k > 0$, you can find a constant $a$ such that the inequality $f(x) < k\,g(x)$ holds for all $x > a$,

and big-O means

For at least one choice of a constant $k > 0$, you can find a constant $a$ such that the inequality $f(x) < k\,g(x)$ holds for all $x > a$.
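These two definitions can be probed numerically (an informal sketch, not a proof; the example functions below are my own choices, not from the question):

```python
# Numeric illustration: big-O needs the ratio f(x)/g(x) to stay bounded
# by SOME k > 0; little-o needs it to eventually drop below EVERY k > 0.
def ratios(f, g, xs):
    return [f(x) / g(x) for x in xs]

xs = [10, 100, 1000, 10_000]

# f(x) = 2x is O(x): the ratio is constantly 2, so any k > 2 works.
print(ratios(lambda x: 2 * x, lambda x: x, xs))   # [2.0, 2.0, 2.0, 2.0]

# But f(x) = 2x is NOT o(x): the ratio never drops below, say, k = 1.

# f(x) = x IS o(x^2): the ratio tends to 0, so every k > 0 eventually works.
print(ratios(lambda x: x, lambda x: x * x, xs))   # [0.1, 0.01, 0.001, 0.0001]
```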

But it is not clear to me how to compare $n^{\mathcal{O}(1)}$ and $n^{\log n}$, because whether $n^{\mathcal{O}(1)} < n^{\log n}$ or $n^{\mathcal{O}(1)} > n^{\log n}$ depends both on $n$ itself and on the actual exponent of $n$.

For example, $2^{1000} > 2^{\log 2}$, but as $n$ grows, $n^{\log n}$ will become larger. So for a particular $n$ it may be that $$n^{\mathcal{O}(1)} < n^{\log n}$$ or $$n^{\mathcal{O}(1)} > n^{\log n}$$ or $$n^{\mathcal{O}(1)} = n^{\log n}$$

But $n^{\log n}$ grows asymptotically faster.

So, are we dealing with big-O or little-o?


$n^{O(1)}$ is actually both $O(n^{\log n})$ and $o(n^{\log n})$. Note that $o(n^{\log n})$ is the stronger upper bound: $f(n) = o(n^{\log n}) \implies f(n) = O(n^{\log n})$.

Note that any function in $O(1)$ is bounded above by some constant; denote such a bound by $c$, so it suffices to compare $n^c$ with $n^{\log n}$. We want to show that for any choice of $k > 0$ we can find a constant $a$ such that $n^c < k\,n^{\log n}$ for all $n > a$. Fix any $\epsilon > 0$. Since $k$ is a constant, $n^{\epsilon} > 1/k$ for all $n > (1/k)^{1/\epsilon}$. Now select $a = \max\bigl((1/k)^{1/\epsilon},\, e^{c+\epsilon}\bigr)$, so that every $n > a$ satisfies both $n^{\epsilon} > 1/k$ and $\log n > c + \epsilon$. Then whenever $n > a$, $$k\,n^{\log n} > k\,n^{c+\epsilon} = (k\,n^{\epsilon})\,n^c > n^c.$$ Thus $n^{O(1)} = o(n^{\log n})$, and it follows that $n^{O(1)} = O(n^{\log n})$.
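The conclusion can also be sanity-checked numerically (a sketch under my own arbitrary choice of exponent, standing in for the $O(1)$ term):

```python
import math

# For a fixed exponent c (here c = 5, an arbitrary illustrative choice),
# the ratio n^c / n^{log n} = n^{c - log n} eventually tends to 0,
# which is what the little-o claim asserts.
c = 5
for n in [10, 1000, 10**6]:
    ratio = n ** (c - math.log(n))   # math.log is the natural log
    print(n, ratio)

# The ratio can still exceed 1 for small n (e.g. n = 10), which is why
# the definition only demands the inequality for all n beyond some
# threshold a, not for every n.
```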

When proving these things, it is important to realize that you are comparing functions as the inputs get arbitrarily large. Sure, $2^{1000} > 2^{\log 2}$, but a comparison at a single point says little about the relative growth rates of the functions in general. The definitions of big-O and little-o exist precisely to make this kind of asymptotic comparison well defined.