Evaluating big-O vs big-Omega for two functions


We were tasked with comparing the complexities of two functions: $n$ and $n^{0.99} (\log(n))^2$.

As I understand it, the general approach is to evaluate $\lim_{n \rightarrow \infty} \frac{f(n)}{g(n)}$. If the limit evaluates to 0, we have $f \in O(g)$. If the limit evaluates to infinity, $f \in \Omega(g)$. If it evaluates to a constant, then $f \in \Theta(g)$.

If we plot the quotient of these two functions, then we clearly see that the result approaches 0: Wolfram Alpha Plot

However, if we plug the limit into Wolfram Alpha, we get $\lim_{x \rightarrow \infty} \frac{x}{x^{0.99} (\log(x))^2} = \infty$

These two results seem to be contradictory. Interestingly enough, if we ask for $\lim_{x \rightarrow \infty} \frac{d}{dx}\frac{x}{x^{0.99} (\log(x))^2}$, Wolfram Alpha gives 0. I don't think it's possible for a limit to be infinity and the derivative's limit to be 0...

I must be missing something, but I can't figure out what. Any ideas?

Edit: Got it, graphs are useless in this scenario. However, suppose I had to do this calculation manually. How would I go about evaluating the limit?

3 Answers

Best Answer

Two further points.

First, your definition of big-O notation is not quite right.

  • If the limit $\lim_{n \to \infty} \frac{f(n)}{g(n)}$ evaluates to $0$, then $f \in o(g)$; we have $f \in O(g)$ if the limit is either $0$ or a constant. In other words, $O(g) = o(g) \cup \Theta(g)$. Similarly, we write $f \in \omega(g)$ when the limit is $\infty$ and $\Omega(g) = \Theta(g) \cup \omega(g)$. A function in $\Theta(g)$ is in both $O(g)$ and $\Omega(g)$, but isn't in $o(g)$ or $\omega(g)$.

  • Technically, we can also have $f \in \Theta(g)$ if the limit $\frac{f(n)}{g(n)}$ as $n \to \infty$ doesn't exist because of oscillation, but the ratio is still bounded for all sufficiently large $n$. For example, we want $1 + \frac12 \sin n$ to be $\Theta(1)$, though the limit doesn't exist here. But this doesn't usually come up much.
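The bounded-oscillation case above is easy to check numerically. The sketch below (my own illustration, not from the answer) samples $1 + \frac12 \sin n$ at many integers and confirms the ratio never leaves $[\frac12, \frac32]$ even though it never settles down:

```python
import math

# f(n) = 1 + (1/2) sin n compared against g(n) = 1: the ratio f(n)/g(n)
# oscillates forever, so the limit does not exist, yet it stays trapped
# in [1/2, 3/2] -- exactly the Theta(1) situation described above.
values = [1 + 0.5 * math.sin(n) for n in range(1, 100001)]
print(min(values), max(values))  # both within [0.5, 1.5]
```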

Second, to answer your question of how we'd figure this out.

First, we should simplify as much as possible. Comparing $n$ to $n^{0.99} (\log n)^2$ can be reduced to comparing $n^{0.01}$ to $(\log n)^2$ (divide both sides by $n^{0.99}$), which can in turn be reduced to comparing $n^{0.005}$ to $\log n$ (take square roots).

In practice, you just learn the asymptotic comparisons between common functions. One of them is that $\log n \in o(n^\alpha)$ for any $\alpha > 0$, which answers this question. But if we wanted to understand the comparison above better, we could also substitute $n = k^{200}$, which gives us the comparison $(k^{200})^{0.005} = k$ vs. $\log (k^{200}) = 200\log k$. This comparison goes the same way as $k$ vs. $\log k$, which is hopefully more clear-cut.
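A quick numeric sanity check of the substitution (my own sketch, using natural log) shows $k$ dwarfing $200\log k$ once $k$ is large:

```python
import math

# After substituting n = k**200, the comparison n**0.005 vs log n
# becomes k vs 200*log(k). Sampling a few points shows k eventually
# dominates, so log n is in o(n**0.005).
for k in (10, 10**3, 10**6, 10**9):
    print(k, 200 * math.log(k), k / (200 * math.log(k)))
```

The ratio in the last column starts below $1$ but grows without bound, which is the $k$ vs. $\log k$ comparison in action.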

Answer

The plot is misleading: $x$ must become quite large before $x^{0.01}$ dominates over $\log(x)^2$, but eventually it does. The limit query is indeed correct.
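To see just how large "quite large" is, we can work in log-space (a sketch of my own, assuming natural log). Writing $t = \log x$, the ratio $x^{0.01}/(\log x)^2$ has logarithm $0.01t - 2\log t$, whose derivative $0.01 - 2/t$ vanishes at $t = 200$, i.e. at $x = e^{200} \approx 7 \times 10^{86}$, far beyond any plot window:

```python
import math

def log_ratio(t):
    # log of x**0.01 / log(x)**2, written in terms of t = log(x)
    # to avoid overflowing floats at astronomically large x.
    return 0.01 * t - 2 * math.log(t)

print(log_ratio(100))    # still decreasing here (x = e**100)
print(log_ratio(200))    # the minimum: x = e**200
print(log_ratio(10000))  # well past the minimum: large and positive
```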

As for the derivative: yes, it is possible for a function to tend to infinity while its derivative tends to $0$. For a simpler example, consider $x^\alpha$ for any $0<\alpha<1$.

Answer

$$\frac{x^{0.01}}{\log^2x}$$ evaluated at $x=e^{10000}$ gives about

$$2.7\cdot10^{35}.$$

Not so close to $0$...
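The quoted value can be reproduced without overflow by evaluating in log-space (my own sketch, natural log assumed):

```python
import math

# x = e**10000 overflows a float, so compute x**0.01 / log(x)**2
# via its logarithm, with t = log(x) = 10000.
t = 10000.0
ratio = math.exp(0.01 * t - 2 * math.log(t))
print(ratio)  # about 2.7e35, matching the value quoted above
```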