When do we use log when calculating computational complexity?


I have difficulty with the following question.

Tom types a random number $x$ into a calculator. He finds that after pressing the sqrt button a few times, the calculator displays $1$ once the value lies between $1$ and $1+w$, where $w$ is the numerical precision. Show that after $\Theta(\log\log x + \log\frac{1}{w})$ presses, where $\log$ is in base $2$, Tom gets $1$.

I have no idea where the $\log\log$ term comes from. Thank you.


If you take the square root $k$ times, you are really computing $x^{1/2^k}$. So the question is: how large does $k$ need to be for $x^{1/2^k}<1+w$? Taking $\log_2$ of both sides, this is equivalent to $\log_2(x)/2^k<\log_2(1+w)$, or $\log_2(x)/\log_2(1+w)<2^k$. Taking $\log_2$ again, you see that you need $k>\log_2\bigl(\log_2(x)/\log_2(1+w)\bigr)=\log_2\log_2(x)-\log_2\log_2(1+w)$. Note that for small $w$, $\log_2(1+w)=\Theta(w)$, so $-\log_2\log_2(1+w)=\log_2(1/w)+O(1)$ and the bound becomes $\Theta(\log\log x+\log(1/w))$.
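As a quick sanity check (not part of the proof), here is a small Python sketch that counts actual square-root presses and compares them against the bound derived above. The helper names `presses_until_one` and `predicted_bound` are my own, chosen for illustration:

```python
import math

def presses_until_one(x, w):
    """Count how many square-root presses it takes until x drops
    below 1 + w, the point at which a calculator with precision w
    displays 1."""
    k = 0
    while x >= 1 + w:
        x = math.sqrt(x)
        k += 1
    return k

def predicted_bound(x, w):
    """The derivation says k must exceed
    log2( log2(x) / log2(1 + w) )."""
    return math.log2(math.log2(x) / math.log2(1 + w))

if __name__ == "__main__":
    for x, w in [(2.0, 1e-10), (1e6, 1e-10), (1e100, 1e-6)]:
        k = presses_until_one(x, w)
        b = predicted_bound(x, w)
        print(f"x={x:g}, w={w:g}: {k} presses, bound ≈ {b:.2f}")
```

In each case the observed press count is the smallest integer strictly greater than the bound, matching the inequality $k>\log_2\bigl(\log_2(x)/\log_2(1+w)\bigr)$.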