Question We are given two sorting algorithms. The running time of Algorithm $1$ is $O(n^2)$ while Algorithm $2$ has running time $O(n \log n)$. Assume that Algorithm $1$ and Algorithm $2$ each take $1$ second to sort $1000$ numbers. How long would it take each algorithm to sort a) $10^6$ numbers b) $10^9$ numbers?
Comments I realize this may be a simple question, but I'm just starting this material. A hint or a full answer would be appreciated.
You've seen two answers that illustrate that big-O timing estimates don't tell you much about actual running times, but to illustrate how big a difference the actual orders make, let's dispense with the big-O estimates and suppose we actually knew that Algorithm 1 took exactly $c_1n^2$ seconds to sort $n$ numbers and Algorithm 2 took exactly $c_2n\log_{10} n$ seconds to sort $n$ numbers (I chose log to the base ten to make the calculations simpler; choosing a different base would only change the constant $c_2$).
If Algorithm 1 took 1 second to sort $10^3$ numbers, we'd have $$ 1=c_1(10^3)^2=c_1\,10^6 $$ so $c_1=1/10^6$. If Algorithm 2 also took 1 second to sort $10^3$ numbers, we'd have $$ 1=c_2(10^3\log(10^3))=c_2(3\cdot10^3) $$ so $c_2=1/(3\cdot10^3)$.
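If you want to double-check the constants numerically, here's a minimal sketch (assuming the exact running-time models $c_1 n^2$ and $c_2 n \log_{10} n$ from above):

```python
from math import log10

n = 10**3  # both algorithms sort 10^3 numbers in 1 second

c1 = 1 / n**2            # from 1 = c1 * n^2
c2 = 1 / (n * log10(n))  # from 1 = c2 * n * log10(n)

print(c1)  # 1e-06
print(c2)  # 1/3000, about 0.000333...
```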
Now let's see what happens when we sort $10^6$ numbers. Algorithm 1 will take $$ \frac{1}{10^6}\cdot(10^6)^2=\frac{10^{12}}{10^6}=10^6\text{ seconds} $$ which works out to be about $11.5$ days. On the other hand, Algorithm 2 will take $$ \frac{1}{3\cdot10^3}\cdot(10^6\log10^6)=\frac{6\cdot10^6}{3\cdot10^3}=2\cdot10^3\text{ seconds} $$ which is about $33.3$ minutes. Hmm, 11 days versus half an hour. I know which algorithm I'd use.
For larger inputs the difference in times becomes even more dramatic. With $10^9$ numbers to sort, you can work out that Algorithm 1 would take about $31{,}688$ years, while Algorithm 2 would take about $34.7$ days.
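The whole comparison can be reproduced in a few lines, again assuming the exact models $c_1 n^2$ and $c_2 n \log_{10} n$ with the constants derived above:

```python
from math import log10

c1 = 1 / 10**6        # Algorithm 1: c1 * n^2 seconds
c2 = 1 / (3 * 10**3)  # Algorithm 2: c2 * n * log10(n) seconds

def t1(n):
    """Seconds for Algorithm 1 to sort n numbers."""
    return c1 * n**2

def t2(n):
    """Seconds for Algorithm 2 to sort n numbers."""
    return c2 * n * log10(n)

for n in (10**6, 10**9):
    print(f"n = {n}: {t1(n):.0f} s vs {t2(n):.0f} s")
# n = 10^6: 10^6 s (~11.5 days) vs 2*10^3 s (~33 min)
# n = 10^9: 10^12 s (~31,688 years) vs 3*10^6 s (~34.7 days)
```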
Notice that Algorithm 1 had a much smaller constant multiplier than Algorithm 2, but in the long run that didn't matter anywhere near as much as the asymptotic order of their running times. Simply put, an $n\log n$ algorithm will eventually clobber an $n^2$ algorithm, no matter what the constant multiples are.