Why is $f(n) = n^{1.01}$, $g(n) = n(\log n)^2$ ultimately $f = \Omega(g)$?


My objective is to determine, for each of the following pairs of functions, whether $f = O(g)$, $f = \Omega(g)$, or both (i.e., $f = \Theta(g)$).

(a) $f(n) = n^{1.01}$; $g(n) = n(\log n)^2$

(b) $f(n) = \frac{n^2}{\log n}$; $g(n) = n(\log n)^2$

My understanding is that $f = O(g)$ means the running time of the algorithm grows at most proportionally to $g(n)$, whereas $f = \Omega(g)$ means it grows at least proportionally to $g(n)$.

I tried to graph both functions for part (a) in WolframAlpha, but their growth seems to be the opposite of the answer sheet I found here (#7), which claims that $f = \Omega(g)$. Doing a few Google searches, I found some answers to this question using limits; however, my familiarity with limits is limited (no pun intended), and I am trying to understand the solution through other (simpler) means.

Can anyone please explain what the correct answer to this problem is, and how you arrived at that determination, either graphically via WolframAlpha or by some other means?

Thank you!


The easiest way to see this is via the substitution $n = u^{100}$. We get $f(n) = n^{1.01} = u^{101}$ and $g(n) = u^{100}\log(u^{100})^2 = u^{100}(100\log u)^2 = 10^4\, u^{100}(\log u)^2$. Then $f/g = u / (10^4 (\log u)^2)$, which tends to $+\infty$, since $u$ dominates $(\log u)^2$.

$f$ is the dominant one in both examples. In example (b), $f/g = \frac{n^2/\log n}{n(\log n)^2} = n/(\log n)^3$, and again $n$ dominates $(\log n)^3$.
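You can also check this numerically. A minimal Python sketch (the function names are mine): it evaluates $\ln(f/g)$ as a function of $t = \ln n$, taking all logs as natural logs, so that values of $n$ far beyond floating-point range can be probed:

```python
import math

def log_ratio_a(t):
    # part (a): f/g = n**0.01 / (ln n)**2, so ln(f/g) = 0.01*t - 2*ln(t)
    return 0.01 * t - 2 * math.log(t)

def log_ratio_b(t):
    # part (b): f/g = n / (ln n)**3, so ln(f/g) = t - 3*ln(t)
    return t - 3 * math.log(t)

# both log-ratios dip at first but eventually grow without bound,
# confirming f = Omega(g) in both parts
for t in (10, 100, 1000, 10000):
    print(t, log_ratio_a(t), log_ratio_b(t))
```

Note that the part-(a) log-ratio is still negative at $t = \ln n = 1000$, i.e. at $n \approx 10^{434}$: the crossover happens far beyond anything a plot over a realistic range can show.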


So, actually:

  • $f=O(g)$ means that there exists a constant $M$ such that $f(x) \leq M\cdot g(x)$ for all $x>x_0$; so $f$ grows at most as fast as $g$
  • conversely, $f=\Omega(g)$ means that there exists a constant $m>0$ such that $f(x)\geq m \cdot g(x)$ for all $x>x_0$; so $f$ grows at least as fast as $g$
  • finally, $f=\Theta(g)$ means that both are true

In these definitions, the condition $x>x_0$ can be read as "the inequality holds for all large enough $x$", so the usual approach is to look at the limit of $f/g$ as $x$ goes to $+\infty$.

Roughly speaking, $f=O(g)$ iff $\lim f/g <\infty$, and $f=\Omega(g)$ iff $\lim f/g >0$ (in both cases assuming $g(x)>0$ for large enough $x$). You can read more here: https://en.wikipedia.org/wiki/Big_O_notation
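If you want to check such a limit mechanically rather than by hand, a computer algebra system can do it. A small sketch using SymPy (assuming it is installed; the symbol names are mine):

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f = x**sp.Rational(101, 100)   # x**1.01, kept exact as a rational power
g = x * sp.log(x)**2

# the limit of f/g as x -> oo is infinite, so f = Omega(g) but f != O(g)
print(sp.limit(f / g, x, sp.oo))
```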

Now, for your specific problem,

$$\frac{f(x)}{g(x)}=\frac{x^{0.01}}{(\log(x))^2}$$

It is well known that $\lim_{x\to\infty} \frac{x^a}{(\log(x))^b} = \infty$ whenever $a>0$, for any $b\in\mathbb{R}$. In French, this is called the theorem of compared growths (https://fr.wikipedia.org/wiki/Th%C3%A9or%C3%A8me_des_croissances_compar%C3%A9es).
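This also explains why the WolframAlpha plot was misleading: the ratio $x^{0.01}/(\log x)^2$ does not pass $1$ until $x$ is astronomically large. A short bisection sketch in the variable $t = \ln x$ (taking $\log = \ln$; the variable names are mine) locates the crossover:

```python
import math

def log_ratio(t):
    # ln of x**0.01 / (ln x)**2, evaluated at t = ln(x)
    return 0.01 * t - 2 * math.log(t)

# bisect on [2, 1e6]: the log-ratio is negative at t=2, positive at t=1e6
lo, hi = 2.0, 1e6
while hi - lo > 1e-6:
    mid = (lo + hi) / 2
    if log_ratio(mid) < 0:
        lo = mid
    else:
        hi = mid

# the crossover sits near t ~ 1457, i.e. x ~ e**1457, roughly 10**633
print(f"f/g passes 1 near x = 10^{lo / math.log(10):.0f}")
```

No plot over a humanly graphable range can show this crossover, which is why the graph appears to contradict the $f = \Omega(g)$ answer even though the answer is correct.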