Big O, Omega and Theta Notation


For time complexity I get that:

  • $O(n)$ = worst case

  • $\Omega(n)$ = best case

  • $\Theta(n)$ = exactly (best and worst)

But I'm facing the following:

"State whether the statement is true or false. If true, provide justification, and if false, give the correct relation."

  1. $f(n) = \log n$, $g(n) = \sqrt n$, and $f = O(g)$
  2. $f(n) = 2^n$, $g(n) = 4^n$, and $f = \Omega(g)$
  3. $f(n) = n^3 - n^2$, $g(n) = n^2$, and $f = O(g)$
  4. $f(n) = 2n + \log n$, $g(n) = 2n \log n$, and $f = \Theta(g)$
  5. $f(n) = \log_2 n$, $g(n) = \log_{10} n$, and $f = \Theta(g)$

Is this anything more complicated than plugging in values for n to see which one grows faster?

E.g.

  1. True. E.g. if $n = 256$, then $\sqrt{256} = 16$, while $\log_2 256 = 8$ and $\log_{10} 256 \approx 2.4$ are both $< 16$.
  2. False. $f = O(g)$ because $g(n) > f(n)$ for all positive values of $n$.
  3. False. $f = \Omega(g)$ At large values of $n, n^3$ will dwarf $n^2$.
  4. False. $f = O(g)$ since $2n + \log n$ is much smaller than $2n \log n$.
  5. True. Regardless of the log base they will both grow at the same rate.

Am I doing this right? Or am I missing something?


On BEST ANSWER

Apparently the way to answer these is the following three limit criteria (valid when the limit exists):

$f = O(g)$ if $\lim_{n \to \infty} \frac{f}{g} < \infty$

$f = \Theta(g)$ if $\lim_{n \to \infty} \frac{f}{g} = c$ for some constant $0 < c < \infty$

$f = \Omega(g)$ if $\lim_{n \to \infty} \frac{f}{g} > 0$

Less formally:

  • if $f$ grows slower than $g$, then $f = O(g)$
  • if $f$ grows faster than $g$, then $f = \Omega(g)$
  • if $f$ and $g$ grow at the same rate, then $f = \Theta(g)$
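
These limit rules can also be sanity-checked numerically. The Python sketch below (the helper `ratio_trend` is my own, not from the answer) evaluates $f(n)/g(n)$ at increasingly large $n$; a shrinking ratio suggests $f = O(g)$, a growing one suggests $f = \Omega(g)$, and a ratio settling on a positive constant suggests $f = \Theta(g)$. This only hints at the limit, it doesn't prove it:

```python
import math

def ratio_trend(f, g, ns=(10, 100, 10**4, 10**8)):
    """Evaluate f(n)/g(n) at increasingly large n to suggest the limit."""
    return [f(n) / g(n) for n in ns]

# f(n) = log n, g(n) = sqrt(n): the ratio shrinks toward 0, hinting f = O(g)
print(ratio_trend(math.log, math.sqrt))
```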

Back to the five:

(a):

$\lim_{n \to \infty} \frac{\log n}{\sqrt n} = 0$

True. $\sqrt n$ grows faster than $\log n$ (the limit is $0$, e.g. by L'Hôpital's rule), so $f = O(g)$.

(b):

$\frac{2^n}{4^n}$ can be rewritten as $(\frac{2}{4})^n$, which is $(\frac{1}{2})^n$, which clearly heads toward $0$ as $n \to \infty$, so this is false: $f \neq \Omega(g)$; rather, $f = O(g)$.

(c):

$\frac{n^3-n^2}{n^2}$: factor out the $n^2$ and cancel, and you get $n-1$, which grows without bound. That shows $f$ grows faster than $g$, hence (c) is false: $f \neq O(g)$; instead $f = \Omega(g)$.

(d):

$\frac{2n+\log n}{2n \log n} = \frac{1}{\log n} + \frac{1}{2n} \to 0$, so $f$ grows slower than $g$; hence this is false, $f = O(g)$, not $\Theta(g)$.

(e):

$\frac{\log_2 n}{\log_{10} n} = \frac{\ln 10}{\ln 2} \approx 3.32$ for every $n > 1$. While $\log_2 n$ is larger than $\log_{10} n$, they differ only by a constant factor, i.e. they grow at the exact same rate, hence this is true: $f = \Theta(g)$.
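
For a quick numerical cross-check of all five parts, the sketch below (my own, not from the answer) evaluates each ratio at growing $n$; the trends agree with the limits derived above, though evaluation alone doesn't replace the derivations:

```python
import math

# Evaluate each ratio f(n)/g(n) from the five parts at growing n.
ns = (10, 100, 500)  # kept modest so 4**n stays within float range
ratios = {
    "a": [math.log(n) / math.sqrt(n) for n in ns],               # -> 0: f = O(g)
    "b": [2.0**n / 4.0**n for n in ns],                          # -> 0: f = O(g)
    "c": [(n**3 - n**2) / n**2 for n in ns],                     # -> infinity: f = Omega(g)
    "d": [(2*n + math.log(n)) / (2*n*math.log(n)) for n in ns],  # -> 0: f = O(g)
    "e": [math.log2(n) / math.log10(n) for n in ns],             # constant ~3.32: f = Theta(g)
}
for part, r in ratios.items():
    print(part, r)
```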

In other words, I had the right idea, but just plugging in big values isn't a legitimate way to justify the answer, since two functions can compare differently at different values of $n$.

Ideally you could use fancier algebra to reduce these expressions, but for those of us who don't need mathematical formalism to function (no pun intended), the general idea is to identify which function grows faster and then justify it with the limit.


The definition of Big O is not as simple as "worst case", even though that is often how the notation is used. To prove that some function $f$ is $O(g)$, you need to show that there exist values $x_0 > 0$ and $M > 0$ such that $|f(x)| \le M \cdot |g(x)|$ for all $x > x_0$.

Check the formal definition on Wikipedia.
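
As a concrete illustration of those witnesses (an example of my own, not from the question): for $f(n) = n^2 + n$ and $g(n) = n^2$, taking $M = 2$ and $x_0 = 1$ works, since $n^2 + n \le 2n^2$ whenever $n \ge 1$. A quick check in Python:

```python
# Sanity-check the Big-O witnesses M = 2, x0 = 1 for f(n) = n^2 + n, g(n) = n^2
# (an illustrative example, not from the question above).
def f(n): return n**2 + n
def g(n): return n**2

M, x0 = 2, 1
ok = all(abs(f(n)) <= M * abs(g(n)) for n in range(x0 + 1, 100_000))
print(ok)
```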