Solving recurrences of the form $T(n) = aT(n/a) + \Theta(n \log_2 a)$


The time complexity for the merge sort algorithm is $T(n) = 2T(n/2)+\Theta(n)$, and the solution to this recurrence is $\Theta(n\lg n)$.

However, suppose that instead of dividing the array in half, you divide it into, say, $a$ pieces. Then the running time would be something like $T(n) = aT(n/a) + \Theta(n \log_2 a)$, the last term being the time to perform the divide-and-merge for $a$ pieces with total size $n$. For a positive integer $a$, what is the solution to this recurrence? (Assume $T(n) = 1$ for $n = 1$ and that $n = a^x$, where $x$ is a positive integer as well.)

I have a hunch it is probably $\Theta(n \log_2 n)$ for any $a$, but I'm not really sure how to prove it, since the logarithm is in a base other than $2$.
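As a quick sanity check on that hunch, one can evaluate the recurrence numerically. This is only a sketch: it uses unit constants in place of the $\Theta$ terms, taking the merge cost to be exactly $n \log_2 a$ and $T(1) = 1$, and it restricts $n$ to powers of $a$ as the question assumes.

```python
import math

def T(n, a):
    """Evaluate T(n) = a*T(n/a) + n*log2(a) with T(1) = 1,
    for n an exact power of a (unit constants stand in for Theta)."""
    if n == 1:
        return 1.0
    return a * T(n // a, a) + n * math.log2(a)

# If T(n) = Theta(n lg n) independent of a, these ratios should all
# approach a common constant as n grows.
for a in (2, 3, 10):
    n = a ** 8
    ratio = T(n, a) / (n * math.log2(n))
    print(f"a = {a:2d}, n = {n}: T(n) / (n lg n) = {ratio:.4f}")
```

The ratios come out close to $1$ for every choice of $a$ (the leftover term is the $n \cdot T(1)$ contribution of the leaves), which is consistent with the hunch.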



Best answer:

Use the Master Theorem, case 2.
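Spelling that out: in the Master Theorem's standard form $T(n) = aT(n/b) + f(n)$, the recurrence here has $b = a$, and $a$ is a fixed constant, so case 2 applies as follows.

```latex
% Case 2 applies when f(n) = \Theta(n^{\log_b a}).
\begin{aligned}
n^{\log_b a} &= n^{\log_a a} = n,\\
f(n) &= \Theta(n \log_2 a) = \Theta(n)
  \quad (\log_2 a \text{ is a constant for fixed } a),\\
\Rightarrow\quad T(n) &= \Theta\!\left(n^{\log_a a} \lg n\right) = \Theta(n \lg n).
\end{aligned}
```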

Another answer:

This is the one piece of time-complexity theory that I remember from my undergrad Algorithms course =) The solution is known in [Cormen-Leiserson-Rivest] as the "master method". The text has the full formula, and I refer you there for more explanation.

The behavior of the recurrence $T(n) = aT(n/a) + \Theta(n)$ could differ a bit from that of $T(n) = aT(n/a) + \Theta(n\log_a n)$, due to the larger overhead of the latter. In the former case (including $a=2$ in your first line), the time complexity works out to $$ n^{\log_a(a)} \lg n = n \lg n,$$ as you have written. In the latter case, with overhead $\Theta(n \log_a n)$, we have to be more clever... [I'll write more later, as obligations are pulling me away from the computer at the moment.]
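For the recurrence as actually posed in the question, with overhead $\Theta(n \log_2 a)$, one does not even need the Master Theorem: unrolling the recursion tree for $n = a^x$ shows directly that the base of the logarithm cancels, so the answer is $\Theta(n \log_2 n)$ for every fixed $a$.

```latex
% Each of the \log_a n levels of the tree costs \Theta(n \log_2 a) in total,
% and the n leaves contribute n \cdot T(1).
\begin{aligned}
T(n) &= n\,T(1) + \sum_{i=0}^{\log_a n - 1} a^i \cdot
        \Theta\!\left(\frac{n}{a^i}\,\log_2 a\right)\\
     &= \Theta(n) + \Theta\!\left(n \log_2 a \cdot \log_a n\right)\\
     &= \Theta(n) + \Theta\!\left(n \log_2 a \cdot \frac{\log_2 n}{\log_2 a}\right)
      = \Theta(n \log_2 n).
\end{aligned}
```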