Good evening, everyone.
I know this is probably easy for many of you, but I haven't been able to find any information on how to solve problems like this one.
- For each function f(n) (the complexity of an algorithm) and each input size n in the table below, determine the time t the algorithm needs to perform its calculations. We assume the algorithms run on a computer capable of performing one million operations per second. You may use approximations in your answers.
I would be really grateful if anybody could explain how to solve it.
| f(n) | n = 10 | n = 1000 | n = 1000000 |
|---|---|---|---|
| $\lg n$ | | | |
| $n$ | | | |
| $n \lg n$ | | | |
| $n^2$ | $\frac{1}{10000}$ seconds | | |
| $2^n$ | | | |
I tried, but I'm just not sure what to do. Thank you in advance for any assistance.
In the second column, plug $n=10$ into each expression in the first column to get the number of operations. Then divide by $1000000$ operations per second to get the time in seconds. Repeat with $n=1000$ and $n=1000000$ for the remaining columns.
For the provided example: plugging in $n=10$ into $n^2$ yields $10^2=100$ operations, which takes $\frac{100}{1000000} = \frac{1}{10000}$ seconds.
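If it helps to check your answers, here is a minimal Python sketch that fills in the whole table this way. The function names and the `OPS_PER_SECOND` constant are my own labels, not from the problem statement; the only wrinkle is that $2^n$ is far too large for a float at $n=1000000$, so that cell is reported as an order of magnitude instead.

```python
import math

OPS_PER_SECOND = 1_000_000  # one million operations per second, as stated

# The complexity functions from the first column of the table.
functions = {
    "lg n":   lambda n: math.log2(n),
    "n":      lambda n: n,
    "n lg n": lambda n: n * math.log2(n),
    "n^2":    lambda n: n ** 2,
    "2^n":    lambda n: 2 ** n,
}

for name, f in functions.items():
    for n in (10, 1000, 1_000_000):
        ops = f(n)  # number of operations at this input size
        try:
            t = ops / OPS_PER_SECOND  # time in seconds
            cell = f"{t:.4g} s"
        except OverflowError:
            # 2^n overflows a float for very large n; estimate the order
            # of magnitude of the time from the integer's bit length.
            exp = (ops.bit_length() - 1) * math.log10(2) - math.log10(OPS_PER_SECOND)
            cell = f"~10^{exp:.0f} s"
        print(f"f(n) = {name:6s}  n = {n:<7}  t = {cell}")
```

For the cell given in the problem ($f(n)=n^2$, $n=10$) it prints $t = 0.0001$ s, i.e. $\frac{1}{10000}$ seconds, matching the table.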