What is the solution of the following recurrence relation? $$\begin{align} T(1) &= 1\\ T(n) &= T(\log n) + \mathcal O(1) \end{align}$$
a) $O(\log n)$
b) $O(\log^* n)$
c) $O(\log^2 n)$
d) $O(n / \log n)$
The best way I know to do this is to guess a bound and prove it by induction. For example, you're already wondering if the function is $\mathcal{O}(\log n)$.
$T(n) \in \mathcal{O}(f(n))$ if there exist constants $c, n_0 > 0$ such that $T(n) \leq c f(n)$ for all $n > n_0$.
I'm going to replace the $\mathcal O(1)$ with $1$ (you can do this with an unspecified constant too), assume logs are base 2, and assert that $T(n) \leq 2\lg n$ with $n_0 = 2$. Base case: $T(2) = T(1) + 1 = 2 \leq 2\lg 2 = 2$ holds. Assume the bound holds for all $n' < n$. Then (working backwards): $T(n) = T(\lg n) + 1 \leq 2\lg(\lg n) + 1 = 2\lg(\lg(n)\sqrt{2})$. Now we need to show $\lg(n)\sqrt{2} \leq n$ (it is, for all $n \geq 2$).
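As a sanity check on the induction, here's a quick numeric sketch. The base case is an assumption on my part (the recurrence doesn't pin down values between $1$ and $2$, so I treat everything $\leq 1$ as the base):

```python
import math

def T(n):
    # Assumed base case: treat T(n) = 1 for all n <= 1,
    # consistent with the given T(1) = 1.
    if n <= 1:
        return 1
    # T(n) = T(lg n) + 1, with the O(1) term taken to be exactly 1.
    return T(math.log2(n)) + 1

# Check the claimed bound T(n) <= 2*lg(n) for integer n >= 2.
assert all(T(n) <= 2 * math.log2(n) for n in range(2, 10_000))
```

The bound is very loose: $T(n)$ stays tiny (it is $5$ even at $n = 65536$) while $2\lg n$ keeps growing.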
Another way is to solve by unrolling: starting with $n$, how many times can you take the log until you get to $1$? That count is the iterated logarithm $\lg^* n$ (which is at most $\lg n$, so the bound above still holds). Write out the sum for what happens at each step, and you'll notice you get a sum of a constant taken $\lg^* n$ times. I'd still use the inductive proof to show that it's true.
Also, there's the master theorem, but it targets recurrences of the form $T(n) = aT(n/b) + f(n)$, so I don't see how to apply it in this case.
I like to think of recurrences of the form $T(n) = T(f(n)) + \Theta(1)$† as "accumulator" recurrences, because they "add up" the number of iterations (or some multiple thereof) it takes for $f(n)$ to go beyond some limit.
In words, this recurrence is determining the number of times one must take a logarithm before reaching $1$. This should sound familiar to a computer science student, as there's a special function to describe this number!
This special function is $\lg^*n$, which is the iterated logarithm. Since some may not be familiar with this function, it is defined as: $$\lg^*n:=\begin{cases} 0& n\leq 1\\ 1+\lg^*(\lg(n)) & n > 1 \end{cases}$$
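For concreteness, the definition above transcribes directly into code (an illustrative sketch; base 2 assumed, matching $\lg$):

```python
import math

def lg_star(n):
    """Iterated logarithm: 0 if n <= 1, else 1 + lg*(lg n)."""
    if n <= 1:
        return 0
    return 1 + lg_star(math.log2(n))

# The value grows extremely slowly:
# lg_star(2) == 1, lg_star(16) == 3, lg_star(65536) == 4
```

Note how slowly it grows: $\lg^* n \leq 5$ for every $n$ up to $2^{65536}$.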
So, a good guess would be that $T(n) \in \Theta(\lg^*n)$. (In fact, if we had $+1$ instead of $+\Theta(1)$, the exact solution would be: $T(n) = \lg^* n + 1$.) The guess can be proven somewhat easily using strong induction.
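The exact solution for the $+1$ variant can also be checked numerically. This is a sketch under the assumption that both recursions bottom out at $n \leq 1$ (the original statement doesn't fix values between $1$ and $2$):

```python
import math

def T(n):
    # Assumed base case consistent with T(1) = 1.
    if n <= 1:
        return 1
    # The +1 variant: additive term is exactly 1.
    return T(math.log2(n)) + 1

def lg_star(n):
    # Iterated logarithm, as defined above.
    if n <= 1:
        return 0
    return 1 + lg_star(math.log2(n))

# T(n) = lg*(n) + 1 holds exactly for the +1 variant.
assert all(T(n) == lg_star(n) + 1 for n in range(1, 10_000))
```

The identity follows by induction: both functions make the same recursive call, offset by $1$ at the base.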
†I must make an important distinction: if you are really talking about big-$\mathcal O$ notation, there isn't a unique "right answer" to this question. This is because $\mathcal O(\cdot)$ is an upper bound for the asymptotic behavior of the recurrence, rather than a tight bound. People often confuse $\mathcal O(\cdot)$ with $\Theta(\cdot)$. As such, I'm assuming that you meant $\Theta(\cdot)$ wherever you wrote $\mathcal O(\cdot)$.