I've tried to find answers on this but a lot of the questions seem focused on finding out the time complexity in Big O notation, I want to find the actual time.
I was wondering how to find the running time of an algorithm given the time complexity of it.
For example: An algorithm runs in $O(n \lg n)$ time and solves a problem of size 1000 in 23 seconds. It would solve a problem of size 10000 in slightly over...
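Under the (simplifying) assumption that the running time is close to $c \cdot n \lg n$ for some fixed constant $c$ (Big-O hides that constant and all lower-order terms, so this is only an estimate), the scaling can be checked with a short script:

```python
import math

def scaled_time(t_old, n_old, n_new):
    """Estimate the run time at size n_new, assuming T(n) = c * n * lg(n)
    with the same (unknown) constant c at both sizes -- an assumption,
    since Big-O ignores constants and lower-order terms."""
    growth = lambda n: n * math.log2(n)
    return t_old * growth(n_new) / growth(n_old)

# 23 seconds at n = 1000; estimate for n = 10000
print(scaled_time(23, 1000, 10000))  # about 306.7 seconds, i.e. slightly over 5 minutes
```

The constant $c$ cancels in the ratio, which is why it never needs to be known for this kind of estimate.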
Or, for another example, a comparison between two:
- Suppose you have a computer that requires 1 minute to solve problem instances of size $n = 100$. What instance sizes can be run in 1 minute if you buy a new computer that runs 100 times faster than the old one, assuming the following time complexities $T(n)$ for the algorithm?
(a) $O(n^2)$
(b) $O(2^n)$
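For what it's worth, both cases reduce to solving $T(n_{\text{new}}) = 100 \cdot T(n_{\text{old}})$ for $n_{\text{new}}$, again assuming the hidden constant is the same on both machines. A quick sketch of that algebra:

```python
import math

def new_size_quadratic(n_old, speedup):
    # T(n) = c*n^2:  c*n_new^2 = speedup * c*n_old^2  =>  n_new = n_old * sqrt(speedup)
    return n_old * math.sqrt(speedup)

def new_size_exponential(n_old, speedup):
    # T(n) = c*2^n:  2^n_new = speedup * 2^n_old  =>  n_new = n_old + lg(speedup)
    return n_old + math.log2(speedup)

print(new_size_quadratic(100, 100))    # 1000.0  (10x larger instances)
print(new_size_exponential(100, 100))  # about 106.64 (only ~6-7 more than before)
```

Note the qualitative difference: a 100x faster machine multiplies the feasible size for a quadratic algorithm, but only *adds* about $\lg 100 \approx 6.64$ to it for an exponential one.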
Thanks, I'm a bit baffled.
You cannot know the exact running time from the order alone, but let me reformulate your question as: what is an upper bound on the number of operations for a given order and a given value of $N$?
For example, with $T(n) = O(N \log N)$ and $N = 100$, an upper bound on the number of operations is $100 \log_2(100) \approx 100 \cdot 6.64 = 664$.
Now, depending on the speed of the computer, you can estimate how much time this will take.
You can write a simple program that performs 664 iterations and measure how long it runs.
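A minimal sketch of that measurement, with the caveat that an empty loop body is only a stand-in for whatever one "step" of the real algorithm costs (that per-step cost is exactly the constant factor Big-O throws away):

```python
import math
import timeit

N = 100
iterations = round(N * math.log2(N))  # about 664 for N = 100

# Time the loop many times and average, since a single run of ~664
# trivial iterations is too fast to measure reliably.
elapsed = timeit.timeit("for _ in range(iterations): pass",
                        globals=globals(), number=10000)
print(f"{iterations} iterations, average loop time: {elapsed / 10000:.2e} s")
```

Multiplying that measured per-loop time by the operation count for a larger $N$ gives a rough wall-clock estimate on that particular machine.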