How long will it take to sort $10^6$ numbers and $10^9$ numbers with two algorithms if both take the same time to sort $1000$ numbers?


Question We are given two sorting algorithms. Algorithm $1$ has running time $O(n^2)$, while Algorithm $2$ has running time $O(n \log n)$. Assume that Algorithm $1$ and Algorithm $2$ each take $1$ second to sort $1000$ numbers. How long would it take each algorithm to sort a) $10^6$ numbers, b) $10^9$ numbers?

Comments I realize this may be a simple question, but I'm just starting this material. A hint or a full answer would be appreciated.


There are 3 answers below.

BEST ANSWER

You've seen two answers illustrating that big-O timing estimates don't tell you much about actual running times, but to illustrate how much the actual growth rates matter, let's dispense with the big-O estimates and suppose we actually knew that Algorithm 1 took precisely $c_1n^2$ seconds to sort $n$ numbers and Algorithm 2 took exactly $c_2n\log_{10} n$ seconds to sort $n$ numbers (I chose base-ten logarithms to make the calculations simpler; choosing a different base would only change the constant $c_2$).

If Algorithm 1 took 1 second to sort $10^3$ numbers, we'd have $$ 1=c_1(10^3)^2=c_1\,10^6 $$ so $c_1=1/10^6$. If Algorithm 2 also took 1 second to sort $10^3$ numbers, we'd have $$ 1=c_2(10^3\log(10^3))=c_2(3\cdot10^3) $$ so $c_2=1/(3\cdot10^3)$.

Now let's see what happens when we sort $10^6$ numbers. Algorithm 1 will take $$ \frac{1}{10^6}\cdot(10^6)^2=\frac{10^{12}}{10^6}=10^6\text{ seconds} $$ which works out to be about $11.5$ days. On the other hand, Algorithm 2 will take $$ \frac{1}{3\cdot10^3}\cdot(10^6\log10^6)=\frac{6\cdot10^6}{3\cdot10^3}=2\cdot10^3\text{ seconds} $$ which is about $33.3$ minutes. Hmm, 11 days versus half an hour. I know which algorithm I'd use.

For larger inputs the difference in times becomes even more dramatic. With $10^9$ numbers to sort, you can work out that Algorithm 1 would take about 31688 years, while Algorithm 2 would take about 34.7 days.
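If you want to check these figures yourself, here is a small Python sketch (my addition, not part of the original answer) that calibrates $c_1$ and $c_2$ from the 1-second run on $10^3$ numbers and then evaluates both models at $10^6$ and $10^9$:

```python
import math

# Both algorithms are assumed to take exactly 1 second at n = 1000,
# which pins down the constants c1 and c2 used above.
N0 = 1_000
c1 = 1 / N0**2                   # c1 * n^2          -> 1 s at n = 10^3
c2 = 1 / (N0 * math.log10(N0))   # c2 * n * log10(n) -> 1 s at n = 10^3

def algo1_seconds(n):
    """Modelled running time of Algorithm 1: c1 * n^2 seconds."""
    return c1 * n**2

def algo2_seconds(n):
    """Modelled running time of Algorithm 2: c2 * n * log10(n) seconds."""
    return c2 * n * math.log10(n)

for n in (10**6, 10**9):
    t1, t2 = algo1_seconds(n), algo2_seconds(n)
    print(f"n = {n:.0e}: Algorithm 1 = {t1:.3g} s ({t1/86400:.3g} days), "
          f"Algorithm 2 = {t2:.3g} s ({t2/86400:.3g} days)")

# n = 1e+06: Algorithm 1 = 1e+06 s (11.6 days),     Algorithm 2 = 2e+03 s (0.0231 days)
# n = 1e+09: Algorithm 1 = 1e+12 s (1.16e+07 days), Algorithm 2 = 3e+06 s (34.7 days)
```

The $10^{12}$ seconds for Algorithm 1 at $n=10^9$ is the roughly 31688 years quoted above.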

Notice that Algorithm 1 had a much smaller constant multiplier than Algorithm 2, but in the long run that didn't matter anywhere near as much as the asymptotic order of their running times. Simply put, an $n\log n$ algorithm will eventually clobber an $n^2$ algorithm, no matter what the constant multiple is.

ANSWER

As stated, the answer is "we don't know". This follows from the meaning of the $O(\cdot)$ notation: roughly speaking, you don't have enough information to infer anything, because you only know the asymptotic behavior, and not even the constants.

Algorithm 1 might very well be as fast as Algorithm 2 as long as $n$ is less than, say, $10^{10}$, and only fall behind after that.

Even worse, big-O notation is only an upper bound. For all we know, both algorithms could have running time $O(n\log n)$: if Algorithm 1 ran in time $O(n\log n)$, it would also, trivially, run in time $O(n^2)$, since $n\log n \le n^2$ for all $n\ge 1$.

ANSWER

Mathematically, big-$O$ notation doesn't work like this. $O(n^2)$ denotes a class of functions, grouped by their asymptotic growth.

So, for example, $f$ defined by $f(n)=n^2$ belongs to $O(n^2)$. So does the function $g$ defined by $g(n)=10^{1000}n^2$.

So, two different algorithms that take time $O(n^2)$ can still have vastly different performance on a specific task. The big-$O$ notation is meant to capture the scalability of these algorithms as $n \rightarrow \infty$.


However, if you're using this for a practical application, the multiplicative constant out in front will probably not be too large. Dropping the $O(\cdot)$ and just evaluating the expression inside can often give a reasonable intuition as to how long the algorithm will take.

So, ignoring the constants, we might estimate that the $O(n^2)$ algorithm would take on the order of $(10^6)^2=10^{12}$ steps to process $10^6$ items, and the $O(n \log n)$ algorithm on the order of $10^9 \ln 10^9 \approx 2 \times 10^{10}$ steps to process $10^9$ items.
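As a quick sanity check on those magnitudes, here is a short Python sketch (my own, not from the answer; it uses the natural logarithm, which is what makes $10^9 \ln 10^9$ come out near $2\times 10^{10}$):

```python
import math

# Back-of-the-envelope step counts, ignoring the hidden constants entirely.
# These are unitless counts, not seconds.
quadratic_steps = (10**6) ** 2           # (10^6)^2 = 10^12
nlogn_steps = 10**9 * math.log(10**9)    # 10^9 * ln(10^9) ~ 2.07e10

print(f"n^2    at n = 10^6: {quadratic_steps:.2e}")   # 1.00e+12
print(f"n ln n at n = 10^9: {nlogn_steps:.2e}")       # 2.07e+10
```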

In most real-world cases an estimate like this won't be too far off, but on the other hand, it can also be wildly wrong.