Give a big-O estimate for f(n) = 3n log(n!) + (n^2 + 3) log n


Give a big-O estimate for f(n) = 3n log(n!) + (n^2 + 3) log n, where n is a positive integer.

Solution: First, the product 3n log(n!) will be estimated. From Example 6 we know that log(n!) is O(n log n). Using this estimate and the fact that 3n is O(n), Theorem 3 gives the estimate that 3n log(n!) is O(n^2 log n).

Next, the product (n^2 + 3) log n will be estimated. Because (n^2 + 3) < 2n^2 when n > 2, it follows that n^2 + 3 is O(n^2). Thus, from Theorem 3 it follows that (n^2 + 3) log n is O(n^2 log n). Using Theorem 2 to combine the two big-O estimates for the products shows that f(n) = 3n log(n!) + (n^2 + 3) log n is O(n^2 log n).
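As a quick numerical sanity check of the textbook's estimate (my own sketch, not part of the solution; it uses natural logarithms, which only changes constants, not the big-O class), the ratio f(n)/(n^2 log n) stays bounded as n grows:

```python
import math

def f(n):
    # f(n) = 3n*log(n!) + (n^2 + 3)*log n, with natural logs.
    # math.lgamma(n + 1) computes ln(n!) without overflow.
    return 3 * n * math.lgamma(n + 1) + (n**2 + 3) * math.log(n)

for n in (10, 100, 1000, 10000):
    ratio = f(n) / (n**2 * math.log(n))
    print(f"n={n:>6}  f(n)/(n^2 log n) = {ratio:.3f}")
```

The ratio creeps up toward 4 (since 3n log(n!) is roughly 3n^2 log n and (n^2 + 3) log n is roughly n^2 log n) but never blows up, consistent with f(n) being O(n^2 log n).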

The part I didn't get: I know that (n^2 + 3) is O(n^2), but isn't log n O(n)? And by the combination rule, (f1 f2)(x) is O(g1(x) g2(x)),

which would give O(n^2) * O(n) = O(n^3) — but the textbook keeps saying it's O(n^2 log n)!

I am missing something, but what is it?

The example is from the textbook "Discrete Mathematics and Its Applications", 7th edition, page 214.

There is 1 answer below.

Best answer:

It often happens that an algorithm has $\mathcal O(\log n)$ average-case and $\mathcal O(n)$ worst-case performance. (One example is searching for an element in a binary tree: this is usually $\mathcal O(\log n)$, but becomes $\mathcal O(n)$ when the tree is very unbalanced.)

You may have been confused by this to think that $\log n$ is somehow "inherently worst-case linear", but that's not true. There are plenty of algorithms where this doesn't happen.

Moreover, saying that "$\log n$ takes $\mathcal O(n)$ in the worst case" is a type error, in a sense. The function $f(n) = \log n$ is not an algorithm, and doesn't have a worst case: it is always only one value. We can say that $\log n = \mathcal O(n)$ in the same way that $\log n = \mathcal O(n^2)$ or $\log n = \mathcal O(2^n)$: $\mathcal O$-notation is always just an upper bound. But $\mathcal O(\log n)$ is more precise: not all functions in $\mathcal O(n)$ are also in $\mathcal O(\log n)$.
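A quick numeric illustration of that last point (a sketch of my own, not from the answer): $\log n$ is indeed in $\mathcal O(n)$, but the ratio $(\log n)/n$ tends to 0, which is why $\mathcal O(\log n)$ is the strictly tighter class:

```python
import math

# (log n) / n shrinks toward 0: log n is in O(n), but O(log n)
# is a much tighter bound on the same function.
for n in (10, 10**3, 10**6, 10**9):
    print(f"n={n:>10}  log(n)/n = {math.log(n) / n:.3e}")
```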

A function like $n^2 \log n$ grows faster than $n^2$ but slower than $n^{2.0000001}$; this is how you should think of log factors in $\mathcal O$-notation. You should keep $\mathcal O(n^2 \log n)$ as $\mathcal O(n^2 \log n)$: replacing the log by any polynomial would give a weaker estimate.
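To see that sandwich numerically (again my own sketch, with $\varepsilon = 0.1$ standing in for the tiny exponent $0.0000001$): $(\log n)/n^\varepsilon \to 0$, so $n^2 \log n$ is eventually below $n^{2+\varepsilon}$, even though $\log n$ still outgrows every constant:

```python
import math

# For any eps > 0, (log n) / n**eps -> 0, so n^2 * log n is
# eventually dominated by n**(2 + eps).  With eps = 0.1 the
# crossover is already visible; for a tinier eps it would just
# sit much further out.
eps = 0.1
for n in (1e3, 1e10, 1e50, 1e100):
    print(f"n={n:.0e}  log(n)/n**eps = {math.log(n) / n**eps:.3e}")
```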