I'm working on my homework and I saw this problem:
A divide-and-conquer algorithm $ X $ divides any given problem instance into exactly two smaller problem instances and solves them recursively. Furthermore, whenever $ X $ divides a given problem instance of size $ n $ into smaller instances with sizes $ a $ and $ b $, then $ a + b = n $ and $ \frac 1 4 n \le a , b \le \frac 3 4 n $.
- If the splitting and merging procedures require $ 2 n $ steps for a problem instance of size $ n $, derive the asymptotic running time $ T ( n ) $ (in $ \Theta ( ) $).
- If the splitting and merging procedures require $ n ^ 2 $ steps for a problem instance of size $ n $, derive the asymptotic running time $ T ( n ) $ (in $ \Theta ( ) $).
I know that I have to use the master theorem after getting the recurrence equation, but I have no idea how to state the form of $ T ( n ) $. In other examples I've seen something like "it splits into three parts", which is understandable, but the $ a + b $ part confuses me. Can someone help me please? Thanks!!
Well, $T(n) = T(a) + T(b) + 2n$ in the first case and $T(n) = T(a) + T(b) + n^2$ in the second, with $a, b$ bounded as you mentioned.
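If it helps to see the recurrences concretely, here is a small numerical sketch (function names are mine, and I'm assuming a base case $T(1) = 1$ and the worst-case split $a = \lfloor 3n/4 \rfloor$, $b = n - a$, which is allowed by the constraint):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def t_linear(n):
    """Worst-case recurrence T(n) = T(3n/4) + T(n/4) + 2n, with T(1) = 1."""
    if n <= 1:
        return 1
    a = (3 * n) // 4          # larger piece, at most 3n/4
    b = n - a                 # smaller piece, so a + b = n
    return t_linear(a) + t_linear(b) + 2 * n

@lru_cache(maxsize=None)
def t_quadratic(n):
    """Worst-case recurrence T(n) = T(3n/4) + T(n/4) + n^2, with T(1) = 1."""
    if n <= 1:
        return 1
    a = (3 * n) // 4
    b = n - a
    return t_quadratic(a) + t_quadratic(b) + n * n

if __name__ == "__main__":
    # The first ratio stays roughly constant (n log n growth);
    # the second ratio converges to a constant (n^2 growth).
    for n in (2**8, 2**10, 2**12):
        print(n, t_linear(n) / (n * math.log2(n)), t_quadratic(n) / n**2)
```

Intuition for why the split imbalance doesn't change the answer: in the $2n$ case, each level of the recursion tree still does total work proportional to $n$ (because $a + b = n$ at every split), and the $\le \frac34 n$ bound keeps the tree depth at $O(\log n)$; in the $n^2$ case the per-level work shrinks geometrically (by at least $\left(\frac34\right)^2 + \left(\frac14\right)^2 = \frac58$ per level), so the root term dominates.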