Why is $O(\log(\log(n)))$ upper bound $\Theta(\log(n))$?


This question has more to do with math than computer science, but there is a computer science component to it.

If I have an algorithm with time complexity $$O(\log(\log(n)))$$ why would its upper bound be $$\Theta(\log(n))$$? I can't see how this follows from the rules of logarithms; what steps were used to arrive at this answer?


Best answer:

Suppose that $f$ is $O(\log(\log n))$, and $g$ is $\Theta(\log n)$. Then there are positive constants $c_0$ and $c_1$ and an $m\in\Bbb Z^+$ such that

$$f(n)\le c_0\log(\log n)$$

whenever $n\ge m$ and

$$g(n)\ge c_1\log n$$

whenever $n\ge m$. Moreover,

$$\lim_{n\to\infty}\frac{\log(\log n)}{\log n}=0\;,$$

so we may further assume that $m$ is large enough so that

$$\frac{\log(\log n)}{\log n}\le\frac{c_1}{c_0}$$

whenever $n\ge m$.

But then for $n\ge m$ we have

$$f(n)\le c_0\log(\log n)\le c_1\log n\le g(n)\;,$$

i.e., $g$ is eventually an upper bound for $f$.
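The inequality chain can be sanity-checked numerically. A minimal sketch in Python, with illustrative constants $c_0 = 2$ and $c_1 = 1$ (hypothetical choices, not part of the proof):

```python
import math

# Illustrative constants (hypothetical): f(n) <= c0*log(log n)
# and g(n) >= c1*log n for sufficiently large n.
c0, c1 = 2.0, 1.0

def f_bound(n):
    """Upper bound on f from f being O(log(log n))."""
    return c0 * math.log(math.log(n))

def g_bound(n):
    """Lower bound on g from g being Theta(log n)."""
    return c1 * math.log(n)

# The ratio log(log n) / log n tends to 0, so eventually
# c0 * log(log n) <= c1 * log n, i.e. f's bound falls below g's.
for n in [10, 100, 10**6, 10**12]:
    ratio = math.log(math.log(n)) / math.log(n)
    print(n, round(ratio, 3), f_bound(n) <= g_bound(n))
```

The printed ratios shrink toward 0, mirroring the limit used in the proof to pick a large enough $m$.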

Second answer:

I think you may have used the wrong notation: Big-Theta means "bounded below and above asymptotically".

If you're asking whether $c \log n > d \log (\log n)$ holds for some $d$ and all $n > n_0$, then it's true, and here's why:

Let's take a look at the function $f(x) = x$: it grows faster than $g(x) = \log x$. You should at least be familiar with this fact.

Consider now that $\log x$ is a monotonically increasing function: $u > v \implies \log u > \log v$.

So, $f(x) > g(x) \implies \log f(x) > \log g(x) \implies \log x > \log (\log x)$.

Of course, all of the above holds only for points $x > x_0$, not for all $x$; the argument goes through once we restrict attention to those appropriate values of $x$.

Now, it's on you to apply the logic to functions which are $O$ and $\Theta$ respectively.
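The monotonicity argument above can be checked directly. A small Python sketch (the sample points are chosen by inspection to lie past the threshold $x_0$, here anything above $e$ so that $\log(\log x)$ is defined and positive):

```python
import math

# For x > e, log x > 1, so log(log x) is defined and positive.
# Since x > log x, applying the monotone function log to both
# sides gives log x > log(log x).
for x in [3, 10, 1000, 10**9]:
    assert x > math.log(x)                       # f(x) = x beats g(x) = log x
    assert math.log(x) > math.log(math.log(x))   # log applied to both sides
print("log x > log(log x) for all sampled x > e")
```

The same one-line trick (apply a monotone function to a known inequality) is what turns "$\log n$ dominates $\log(\log n)$" into a statement about the $O$ and $\Theta$ bounds in the question.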