Estimating the asymptotic runtime of a recurrence equation via its recursion tree

Given the following recurrence equation: $T(n) = T(n/2) + T(n/3) + n$

I need to use a recursion tree to come up with an estimated $O$-bound for the runtime of the recursive algorithm described by the recurrence above.

Without including the exact tree (which is hand drawn), what I find is a geometric series describing the runtime, with common ratio $\frac{5}{6}$, such that the following per-level costs appear (starting from the root): $$n + \frac{5n}{6} + \frac{25n}{36} + \frac{125n}{216} + \cdots$$

With a geometric series whose ratio is less than $1$, we can say that it is the largest term (here the root level) that dominates the runtime of the algorithm. Since the ratio is just a constant, is it correct to assume that in the above case an estimated guess would be $O(n)$?

What confuses me, I think, is that the depth/height of the recursion tree is actually $\Theta(\log n)$; but given that the per-level costs form a geometric series, is it correct to assume the depth has no influence on the asymptotic runtime at all? That is, the cost of level $k$ is $n\left(\frac{5}{6}\right)^k$, which becomes negligible once $k$ is sufficiently large.
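As a numerical sanity check of the series above, here is a small Python sketch (my addition; the helper `level_sum` is hypothetical, not part of the question):

```python
# Sum the per-level costs n * (5/6)^k of the recursion tree.
# The infinite geometric series sums to n / (1 - 5/6) = 6n,
# so the total work across all levels is O(n).
def level_sum(n, levels):
    return sum(n * (5 / 6) ** k for k in range(levels))

n = 1_000_000
# The real tree has about log2(n) ~ 20 levels; summing more
# levels only brings the total closer to the limit 6n.
print(level_sum(n, 40) / n)  # close to 6
```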

1 Answer

I think you can prove linear running time, yes. I understand that the height of the tree can be confusing. To make this intuitive: at every level of the tree, consider the total running time of that level. For example, at the top level (where the root is) the total running time is $n$. In the level below the root it is $\frac{n}{2} + \frac{n}{3} = \frac{5n}{6}$. As you noticed, the total running time decreases by a constant factor of $\frac{5}{6}$ whenever you go down one level. This leads to a geometric series, and summing the running times of the individual levels gives a total running time of $\mathcal{O}(n)$.
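To check this numerically, here is a small sketch (my addition, using a floor-division discretization of the recurrence with base case $T(n) = 1$ for $n \le 1$):

```python
from functools import lru_cache

# Discretized version of T(n) = T(n/2) + T(n/3) + n
# (floor division; base case T(n) = 1 for n <= 1).
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return T(n // 2) + T(n // 3) + n

# If T(n) = O(n), the ratio T(n)/n stays bounded; the geometric
# series argument bounds it by 1 / (1 - 5/6) = 6.
for n in (10**3, 10**5, 10**7):
    print(n, T(n) / n)  # ratios stay below 6
```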

I think it might help to look at an example where this does not work: consider the recurrence $T(n) = 2T(\frac{n}{2}) + n$. The running time at every level of the tree is exactly $n$, so we don't get this nice reduction by a constant factor as we go down the tree. Summing over all $\log_2(n)$ levels gives us $\mathcal{O}(n \log(n))$ running time.
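For contrast, the same numerical experiment on this recurrence (again my sketch, with a floor-division discretization) shows the ratio to $n \log_2(n)$ settling near a constant instead of staying bounded relative to $n$:

```python
import math
from functools import lru_cache

# Discretized version of T(n) = 2T(n/2) + n (base case T(n) = 1 for n <= 1).
@lru_cache(maxsize=None)
def T2(n):
    if n <= 1:
        return 1
    return 2 * T2(n // 2) + n

# Every level costs n and there are log2(n) levels, so
# T2(n) / (n * log2(n)) tends to a constant. For powers of two,
# T2(2^k) = 2^k * (k + 1) exactly, giving ratio (k + 1) / k.
for k in (10, 15, 20):
    n = 2**k
    print(n, T2(n) / (n * math.log2(n)))
```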

You might notice the following: for a recurrence of the form $T(n) = aT(\frac{n}{b}) + n$, the ratio $\frac{a}{b}$ tells you all about the asymptotics of $T(n)$. If $\frac{a}{b} < 1$, then the running time decreases by a constant factor for every level of recursion (for every level we go down in the tree), and thus we get linear running time. If $\frac{a}{b} = 1$, then every level costs exactly $n$ and you get $T(n) = \mathcal{O}(n \log(n))$. And if $\frac{a}{b} > 1$, the per-level costs grow geometrically, the bottom of the tree dominates, and you get $T(n) = \mathcal{O}(n^{\log_b(a)})$.
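A compact way to see the regimes side by side (again a sketch; `make_T` is a hypothetical helper building the discretized recurrence for given $a$ and $b$):

```python
import math
from functools import lru_cache

# Build a discretized T(n) = a*T(n//b) + n with T(n) = 1 for n <= 1.
def make_T(a, b):
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1
        return a * T(n // b) + n
    return T

n = 2**20
print(make_T(1, 2)(n) / n)                   # a/b < 1: bounded ratio, so O(n)
print(make_T(2, 2)(n) / (n * math.log2(n)))  # a/b = 1: bounded ratio, so O(n log n)
print(make_T(4, 2)(n) / n**2)                # a/b > 1: bounded ratio, so O(n^{log_b a}) = O(n^2)
```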

Note that this is just a special case of the so-called master theorem.