Consider the following algorithms:
```c
int F(int N) {
    int n, i, sum = 0;
    for (n = N; n > 0; n = n / 2)
        for (i = 0; i < n; i++)
            sum++;
    return sum;
}
```
```c
int G(int N) {
    int i, j, sum = 0;  /* j must be declared; n was unused */
    for (i = 1; i <= N; i = 2 * i)
        for (j = 0; j < i; j++)
            sum++;
    return sum;
}
```
Why is the number of times `sum++` is executed equal to $2N-1$ in both cases? How could I reach this result using summations?
Thanks in advance!
Just try some examples to see why the worst-case complexity is $O(N)$.
For the first code, you have two nested for loops.
- `for i = N downto 1`: for each iteration the loop variable halves, i.e., $i=N$, then $i=N/2$, etc.;
- `for j = 0 to i`: you increase $j$ by $1$.

So, we have:

1. For $i=N$, you would need to run the second loop from $j=0$ to $N$. This costs, roughly, $N$ iterations. Now, $i=N/2$.
2. For $i=N/2$, you would need to run the second loop from $j=0$ to $N/2$. This costs, roughly, $N/2$ iterations. Now, $i=N/4$.
3. For $i=N/4$, you would need to run the second loop from $j=0$ to $N/4$. This costs, roughly, $N/4$ iterations. Now, $i=N/8$.
4. For $i=N/8$, you would need to run the second loop from $j=0$ to $N/8$. This costs, roughly, $N/8$ iterations.

You would have to repeat these steps, 1., 2., 3., 4., etc., until $i=1$. It is not hard to see that the number of steps is equal to $\lg N$, since you start with $N$ and halve it each time until you get $1$.
Hence, the number of steps is, roughly, equal to $\lg N$.
Now, how much does each step cost? Well, we can easily see that step $i$ costs, roughly, $N/2^{i-1}$ iterations (of the second loop, over $j$). Therefore, the worst-case complexity of the first code is, roughly,

$$\text{cost of each step} \times \text{number of steps},$$

which can be calculated as:
\begin{align}
\underbrace{N+N/2+N/4+N/8+\ldots+1}_{\lg N\text{ terms}} &= N\sum_{i=0}^{\lg N}\dfrac{1}{2^i}\\
&\overset{(a)}{=}N\,\dfrac{1-(1/2)^{\lg N + 1}}{1-1/2}\\
&=2N\left(1-(1/2)(1/2)^{\lg N}\right)\\
&=N(2-1/N)\\
&=2N-1=O(N),
\end{align}
where $(a)$ is due to the geometric sum. Finally, the complexity is $O(N)$, which is linear.
You can prove the same thing for the second code.
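To make that concrete, here is a sketch of the corresponding summation for the second code, assuming $N$ is a power of two so that the outer loop takes exactly the values $i = 1, 2, 4, \ldots, N$. The inner loop costs $i$ iterations for each such $i$, hence the total is
\begin{align}
\sum_{k=0}^{\lg N} 2^k = \dfrac{2^{\lg N+1}-1}{2-1} = 2N-1 = O(N),
\end{align}
again by the geometric sum.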