time complexity of recursive sum function


I have here a function (written in Python) which computes the sum of all numbers from $a$ to $b$. I'd like to know how to determine its time complexity mathematically (without using the master theorem). How can one do that?
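The function itself is not shown in the question, so here is a hedged sketch of the kind of divide-and-conquer sum it presumably describes (the name `rec_sum` and the exact splitting are assumptions):

```python
def rec_sum(a, b):
    """Sum the integers from a to b by splitting the interval at its midpoint.

    Assumed reconstruction of the function in the question, not the
    author's actual code.
    """
    if a == b:              # base case: a singleton interval
        return a
    mid = (a + b) // 2      # integer midpoint of [a, b]
    # the two recursive calls partition [a, b] into [a, mid] and [mid+1, b]
    return rec_sum(a, mid) + rec_sum(mid + 1, b)
```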

Here's what I've tried so far:

I managed to express the argument along the leftmost chain of recursive calls as $\frac{\left(2^{k}-1\right)a+b}{2^{k}}$, after writing the midpoints as a nested average, $\frac{a+\frac{a+\frac{a+b}{2}}{2}}{2}$ (the nesting continues for larger $k$).
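The closed form for the $k$-fold leftmost midpoint can be checked numerically; this is a small sketch (with real-valued arithmetic, ignoring integer rounding) comparing the iterated midpoint against $\frac{(2^k-1)a+b}{2^k}$:

```python
def leftmost_mid(a, b, k):
    """Take the midpoint of [a, m] k times, starting from m = b.

    This models the leftmost chain of recursive calls with exact
    (real-valued) halving, ignoring integer rounding.
    """
    m = b
    for _ in range(k):
        m = (a + m) / 2
    return m

a, b, k = 3.0, 99.0, 5
closed_form = ((2**k - 1) * a + b) / 2**k
# the iterated midpoint matches the closed form (up to float error)
assert abs(leftmost_mid(a, b, k) - closed_form) < 1e-9
```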

Setting the resulting formula equal to $a$ (the base case at which the recursion stops) does not yield results: $$\frac{\left(2^{k}-1\right)a+b}{2^{k}}=a\Rightarrow-a+b=0\Rightarrow a=b$$ which brings me back to the starting point and gives no new information about $k$. I'd really appreciate your help :)


An easy approach is to notice that the two recursive calls apply to two subintervals that partition the input interval. Since the recursion stops on singletons, the function returns immediately (without recursing) exactly n = b-a+1 times, once for each singleton in [a,b].

But any call makes at most two subcalls, so the total number of calls cannot exceed

n + n/2 + n/4 + n/8 + ... < 2n.
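The geometric bound can be checked empirically. This sketch (using an assumed reconstruction of the function, since the question does not show it) counts the actual number of calls and verifies it stays below 2n:

```python
def count_calls(a, b):
    """Count the total number of calls made by the assumed recursive sum.

    Mirrors a midpoint-splitting rec_sum(a, b) but returns the call count
    instead of the sum.
    """
    calls = 1                  # count this call itself
    if a < b:
        mid = (a + b) // 2
        # two subcalls on the halves that partition [a, b]
        calls += count_calls(a, mid) + count_calls(mid + 1, b)
    return calls

for a, b in [(1, 1), (1, 10), (5, 37)]:
    n = b - a + 1
    # total calls are bounded by n + n/2 + n/4 + ... < 2n
    assert count_calls(a, b) < 2 * n
```

In fact the call tree is a full binary tree with n leaves, so the count is exactly 2n - 1, matching the bound.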

Hence the function runs in time Θ(n).