How can I tell/compare the asymptotic complexity of a function?


For something like a quadratic, I just take the highest-degree term and decide whether it is $\Theta(n)$, $O(n)$, or $\Omega(n)$, correct? So, for example, $2n^2+2n+1$ would be $\Theta(n^2)$.

What are the general guidelines/strategies for more complex formulas? For example, something like $\sum_{i=1}^{n^2}\left(\frac{3}{4}\right)^i$. Is it gonna be $n^2$ again?

What if I have something like $4\lg(n) \cdot 2\lg(n)$? Would it be $(\lg n)^2$?

And as far as comparing these complexities, should I just graph them until I start remembering which ones grow faster than others?

Links to examples would also help. Thank you so much!


A good reference for how to think about these things is Graham, Knuth, and Patashnik's Concrete Mathematics. I'll work these examples in detail to give you an idea of how one might proceed.


The first example is

$$ f(n) = 2n^2 + 2n + 1. $$

When $n \geq -1/2$ we have $2n+1 \geq 0$ so that

$$ f(n) \geq 2n^2, $$

and when $n \geq 1$ we know that $n \leq n^2$ so that

$$ f(n) \leq 2n^2 + 2n^2 + n^2 = 5n^2. $$

Thus

$$ 2n^2 \leq f(n) \leq 5n^2 $$

for $n \geq 1$, which is sufficient to conclude that $f(n) \in \Theta(n^2)$.
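A quick numerical check (a sanity check, not a proof) confirms that the sandwich bounds $2n^2 \leq f(n) \leq 5n^2$ hold at a few sample values of $n$:

```python
# Sanity check of the bounds 2n^2 <= f(n) <= 5n^2 for n >= 1,
# where f(n) = 2n^2 + 2n + 1.

def f(n):
    return 2 * n**2 + 2 * n + 1

for n in [1, 10, 100, 10_000]:
    # Both inequalities from the derivation above should hold.
    assert 2 * n**2 <= f(n) <= 5 * n**2
    print(n, 2 * n**2, f(n), 5 * n**2)
```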

An alternate way to see this is to notice that

$$ \lim_{n \to \infty} \frac{f(n)}{2n^2} = 1, $$

so that for any $\epsilon > 0$ it is eventually true that

$$ 1-\epsilon \leq \frac{f(n)}{2n^2} \leq 1+\epsilon $$

for $n$ large enough. Thus

$$ 2(1-\epsilon)n^2 \leq f(n) \leq 2(1+\epsilon)n^2 $$

for $n$ large enough, which is sufficient to conclude that $f(n) \in \Theta(n^2)$ as $n \to \infty$.
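The limit argument is also easy to illustrate numerically: the ratio $f(n)/(2n^2)$ visibly settles toward $1$ as $n$ grows (again, an illustration rather than a proof):

```python
# Watch f(n)/(2n^2) approach 1, illustrating the limit argument.

def f(n):
    return 2 * n**2 + 2 * n + 1

for n in [10, 1_000, 100_000]:
    print(n, f(n) / (2 * n**2))
# The ratio tends to 1, so for any eps > 0 the bounds
# 1 - eps <= f(n)/(2n^2) <= 1 + eps hold for n large enough.
```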


The second example is

$$ g(n) = \sum_{i=1}^{n^2} \left(\frac{3}{4}\right)^i. $$

The first thing to notice is that $\lim_{n \to \infty} g(n)$ actually exists; it is

$$ \lim_{n \to \infty} g(n) = \sum_{i=1}^{\infty} \left(\frac{3}{4}\right)^i = \frac{3/4}{1-3/4} = 3 $$

by the formula for the geometric series. This means that $g(n)$ is of constant order. Indeed, for any $\epsilon > 0$ it will eventually be true that

$$ 3-\epsilon \leq g(n) \leq 3+\epsilon $$

for $n$ large enough, and this is sufficient to conclude that $g(n) \in \Theta(1)$ as $n \to \infty$.
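Computing the partial sums directly shows how quickly $g(n)$ settles near $3$ (a numerical illustration of the convergence above):

```python
# Compute the partial geometric sum g(n) = sum_{i=1}^{n^2} (3/4)^i
# directly and watch it approach the limit 3.

def g(n):
    return sum((3 / 4) ** i for i in range(1, n**2 + 1))

for n in [1, 2, 5, 20]:
    print(n, g(n))
# g(n) approaches 3, so g(n) is Theta(1).
```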

The appearance of the $n^2$ here (as opposed to just $n$ or even $\log n$) doesn't really factor into the leading-order behavior. It would, however, show up in the calculation of the error $g(n)-3$.
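To make that error explicit, the finite geometric sum has a closed form:

$$ g(n) = \sum_{i=1}^{n^2} \left(\frac{3}{4}\right)^i = 3\left(1 - \left(\frac{3}{4}\right)^{n^2}\right), $$

so

$$ 3 - g(n) = 3\left(\frac{3}{4}\right)^{n^2}, $$

which decays extremely fast precisely because the upper limit of the sum is $n^2$ rather than $n$.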


The last example, if I understand correctly, is

$$ h(n) = 4\log n \times 2\log n = 8 (\log n)^2. $$

This is indeed $\Theta\Bigl((\log n)^2\Bigr)$.
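Here the situation is even simpler than in the first example, since the ratio to the comparison function is an exact constant rather than just a convergent one:

```python
from math import log

# h(n) = 4*log(n) * 2*log(n) = 8*(log n)^2, so the ratio
# h(n) / (log n)^2 is exactly 8 for every n > 1.

def h(n):
    return 4 * log(n) * 2 * log(n)

for n in [10, 1_000, 1_000_000]:
    print(n, h(n) / log(n) ** 2)
```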


For intuition on complexities I highly recommend section 9.1 of Concrete Mathematics. You may also be interested in this question and answer.
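As a practical aside on the "should I just graph them" question (my own suggestion, not from the book): rather than graphing, you can compare two growth rates by watching the ratio $f(n)/g(n)$ for increasing $n$. If it shrinks toward $0$, then $f$ grows more slowly than $g$; if it blows up, $f$ grows faster. A minimal sketch, where `ratio_trend` is a hypothetical helper name:

```python
from math import log

# Compare growth rates numerically by watching the ratio f(n)/g(n).
# Ratio -> 0 suggests f grows slower than g; ratio -> infinity, faster.
def ratio_trend(f, g, ns=(10, 100, 10_000, 1_000_000)):
    return [f(n) / g(n) for n in ns]

print(ratio_trend(lambda n: log(n) ** 2, lambda n: n))      # shrinking: (log n)^2 is slower than n
print(ratio_trend(lambda n: n**2, lambda n: n * log(n)))    # growing: n^2 is faster than n log n
```

This only gives intuition, of course; the limit arguments above are what turn that intuition into a proof.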