Asymptotic complexity gives an idea of how rapidly the space/time requirements grow as problem size increases.
• Suppose we have a computing device that can execute 1000 complex operations per second.
Here are the problem sizes that can be solved in a second, a minute, and an hour by algorithms of different asymptotic complexities:
$$\begin{array}{c|ccc} \text{Complexity} & 1\text{ second} & 1\text{ minute} & 1\text{ hour}\\ \hline n & 1000 & 60{,}000 & 3{,}600{,}000 \\ n\log n & 140 & 4893 & 200{,}000 \\ n^2 & 31 & 244 & 1897 \end{array}$$
Could you explain to me how the following was obtained? $$n \log n=1000 \Rightarrow n=140$$
The number 140 comes from solving $n\log_2 n = 1000$ numerically: $140\log_2 140 \approx 998 \le 1000$, while $141\log_2 141 \approx 1007 > 1000$, so $n = 140$ is the largest problem size that fits the budget.
If you can do $n= 1000$ operations in one second, then an $O(n\log n)$ algorithm does a factor of $\log n$ more work per item, so the size of problem you can solve shrinks by roughly that factor. A first guess is $n \approx 1000/\log_2 1000 \approx 100$; this undershoots because the true $n$ is smaller than $1000$, making its logarithm smaller, and iterating $n \leftarrow 1000/\log_2 n$ converges to $n \approx 140$. The logarithm here is base $2$, as is conventional in algorithm analysis; the other $n\log n$ entries in the table ($4893$ and $200{,}000$, both rounded) also work out with base $2$.
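As a sanity check (not part of the original table's derivation), here is a small Python sketch that finds the largest $n$ with $n\log_2 n \le$ budget by bisection; `max_problem_size` is a name chosen for illustration.

```python
import math

def max_problem_size(budget):
    """Largest n with n * log2(n) <= budget, found by bisection."""
    lo, hi = 1, budget  # n * log2(n) <= n is false for n >= 2, so n <= budget
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mid * math.log2(mid) <= budget:
            lo = mid  # mid still fits the budget; search higher
        else:
            hi = mid - 1  # mid exceeds the budget; search lower
    return lo

for budget in (1_000, 60_000, 3_600_000):
    print(budget, max_problem_size(budget))
```

Running this gives exactly 140 for the one-second budget; the one-minute and one-hour results land close to the table's 4893 and 200,000, which appear to be rounded.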