c) Suppose that algorithms A1 and A2 have computational complexities of $O(n \log(n))$ and $O(n^3)$ respectively. Both algorithms take approximately $100$ seconds to run on data input of size $n = 10,000$. Determine (to the nearest second) the time it will take each algorithm to run on data input of size
i. $n = 1,000$
ii. $n = 100$
I've been trying for the past few days to find some way to solve this, but I'm completely stuck. Someone else had a similar problem, but when I tried to use their method it didn't work. Please help; I have an exam on this on Tuesday.
Please bear in mind that I know pretty much nothing about computational complexity.
Let's call the runtimes for an input of size $n$ $A_1(n)$ and $A_2(n)$ respectively. Strictly speaking, $A_1(n) = O(n\log(n))$ only says that $A_1(n)$ is eventually bounded by a constant multiple of $n\log(n)$, but in a question like this it is meant as $A_1(n) \approx c_1\, n\log(n)$ for some constant $c_1 > 0$. We can find $c_1$ from the additional information we are given: $A_1(10\,000) = 100$ seconds. Plugging in (using base-$10$ logarithms, so $\log(10\,000) = 4$) gives $c_1 = \frac{100}{10\,000 \cdot 4} = 0.0025$. Doing the same for $A_2(n) \approx c_2\, n^3$ gives $c_2 = \frac{100}{10\,000^3} = 10^{-10}$.
Now we can solve the problem:
$$A_1(1\,000) \approx 0.0025 \cdot 1\,000 \cdot \log(1\,000) = 0.0025 \cdot 3\,000 = 7.5 \approx 8 \text{ seconds}$$
$$A_2(1\,000) \approx 10^{-10} \cdot 1\,000^3 = 0.1 \approx 0 \text{ seconds}$$
(The base of the logarithm doesn't matter here: changing it rescales $c_1$, and that rescaling cancels when you multiply back, so you get $7.5$ seconds either way.)
The second part of the question works the same way: plug $n = 100$ into the same two formulas.
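If you want to double-check the arithmetic numerically, here is a small Python sketch (the helper `predicted_runtime` and its argument names are just illustrative, not from any library): it fits the hidden constant from the $n = 10\,000$ timing and then scales it to the other input sizes.

```python
import math

def predicted_runtime(growth, n_ref, t_ref, n):
    """Scale a reference timing (t_ref seconds at input size n_ref)
    to a new input size n, assuming runtime ~ c * growth(n)."""
    c = t_ref / growth(n_ref)  # fit the hidden constant from the known timing
    return c * growth(n)

nlogn = lambda n: n * math.log(n)  # natural log; the base cancels in the ratio
cubic = lambda n: n ** 3

for n in (1_000, 100):
    t1 = predicted_runtime(nlogn, 10_000, 100, n)
    t2 = predicted_runtime(cubic, 10_000, 100, n)
    print(f"n = {n:>5}: A1 ~ {t1:.4f} s, A2 ~ {t2:.4f} s")
```

Running it prints roughly $7.5$ s and $0.1$ s for $n = 1\,000$, and $0.5$ s and $0.0001$ s for $n = 100$, matching the hand calculation above.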