How to evaluate growth of input size from n to 2n in this case?


This is the question I am currently working on:

What is the effect on the time required to solve a problem when you double the size of the input from n to 2n, assuming that the number of milliseconds the algorithm takes to solve the problem with input size n is each of the following functions? Express in the simplest form possible.

a) $\log \log n$

b) $n^3$

Here is how I was able to evaluate part b. With input size n, the runtime is just $n^3$. With input size 2n, the runtime is $(2n)^3 = 8n^3$. To find the effect on the time, I computed the scale factor $\frac{8n^3}{n^3}$, which reduces to just 8. By doubling the input size from n to 2n, the runtime increases by a factor of 8.
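As a quick numeric sanity check of this step (my own addition, not part of the original question): the ratio $t(2n)/t(n)$ for $t(n)=n^3$ is exactly 8 regardless of n.

```python
# Sanity check: for t(n) = n^3, doubling the input multiplies the
# runtime by (2n)^3 / n^3 = 8, independent of n.
for n in [10, 1_000, 1_000_000]:
    ratio = (2 * n) ** 3 / n ** 3
    print(n, ratio)  # ratio is 8.0 for every n
```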

How would you do a similar analysis with $\log \log n$? With input size n, the runtime would be $\log \log n$, and with input size 2n, it would be $\log \log 2n$. How would you simplify $\frac{\log \log 2n}{\log \log n}$ in the same way, though?

Edit: $\log n$ analysis, after a comment by Ross Millikan.

$\frac{\log 2n}{\log n} = \frac{\log 2 + \log n}{\log n} = \frac{\log 2}{\log n} + 1$
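This edit can be checked numerically (a sketch of my own, not from the post): the ratio $\frac{\log 2n}{\log n} = 1 + \frac{\log 2}{\log n}$ tends to 1 as n grows, so for $\log n$ runtimes, doubling the input multiplies the time by a factor approaching 1.

```python
import math

# The ratio log(2n)/log(n) equals 1 + log(2)/log(n), which tends
# to 1 as n grows; using base-2 logs here so powers of two are exact.
for n in [2**4, 2**10, 2**30]:
    ratio = math.log2(2 * n) / math.log2(n)
    print(n, ratio)  # 1.25, 1.1, 1.0333... -> approaching 1
```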

On BEST ANSWER

For $\log n$ I think the simplest way to express the increase in processing time when the input doubles is to note that $\log (2n) - \log n=\log n + \log 2 - \log n=\log 2$, a constant (equal to $1$ if the logarithm is taken base $2$), so doubling the input increases the processing time by a constant amount.

For $\log \log n$ I think it is hard to express the increase simply, but the intuition is that doubling the input size barely matters. If we try the same thing as for $\log n$ (taking logs base $2$), we get $\log (\log(2n))-\log (\log(n))=\log(1+\log n)-\log(\log(n))$, which doesn't seem very enlightening on its own. It may be useful to approximate the difference by a derivative: doubling $n$ increases $\log n$ by $1$, so $\log(1+\log n)-\log(\log(n))\approx \frac d{d (\log n)}\log(\log(n))=\frac 1{\ln(2)\log(n)}$, and the added work of doubling the input goes as $\frac 1{\log n}$.
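The conclusion can be checked numerically (a sketch of my own, using base-2 logarithms so that doubling $n$ increases $\log_2 n$ by exactly 1): the increment $\log_2(\log_2 2n) - \log_2(\log_2 n)$ closely tracks the derivative-based estimate $\frac{1}{\ln(2)\log_2 n}$ and shrinks like $1/\log n$.

```python
import math

# Compare the exact increment of log2(log2 n) when n doubles against
# the derivative-based estimate 1 / (ln(2) * log2(n)).
for k in [8, 16, 32, 64]:
    n = 2 ** k                      # chosen so log2(n) = k exactly
    delta = math.log2(math.log2(2 * n)) - math.log2(math.log2(n))
    approx = 1 / (math.log(2) * k)  # derivative-based estimate
    print(k, delta, approx)         # delta and approx agree, both ~ 1/k
```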