We have the following two algorithms and need to determine their big-O complexity:
Algorithm 1 $$ \sum_{i=1}^{n} c \cdot x_i $$
Algorithm 2 $$ c \cdot \sum_{i=1}^{n} x_i $$
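In code, I imagine the two algorithms roughly like this (a Python sketch of my own; I am assuming the input is an array `x` of length `n` and a constant `c`):

```python
def algorithm1(x, c):
    # multiply each element by c, then add the product to the sum
    total = 0
    for xi in x:
        total += c * xi
    return total

def algorithm2(x, c):
    # sum all elements first, then multiply the result by c once
    total = 0
    for xi in x:
        total += xi
    return c * total
```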
Although I have studied the topic for some hours, I am not really comfortable with the analysis of algorithms yet, so I would be happy if you could give me some input on it:
I started with the second one, as it looks easier to me. If we have an array of length $n$, we sum over every element, which gives $O(n)$. Then we multiply once by the constant $c$. This is where I am struggling: does that mean we add 1 to our big-O bound, hence $O(n+1)$? But I learned that we can drop constants, so that would lead us back to $O(n)$.
The first algorithm was more difficult for me. I think it takes longer to execute, because every element in the array is multiplied by $c$ before it can be added to the sum. So intuitively I would say we double the effort; the best bound I can come up with is $O(2n)$, since every element in the array is multiplied once and added once. But here too we can drop the constant, right? So it would also be $O(n)$.
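Writing my counts out explicitly (my own tally, assuming one unit of work per multiplication or addition):

$$ T_1(n) = \underbrace{n}_{\text{multiplications}} + \underbrace{n}_{\text{additions}} = 2n, \qquad T_2(n) = \underbrace{n}_{\text{additions}} + \underbrace{1}_{\text{multiplication}} = n + 1 $$

and both of those end up in $O(n)$ after dropping constants.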
I would be happy if you could help me steer my thoughts in the right direction. The exercise suggests that the two algorithms yield the same result but differ in their complexity, so my analysis must somehow be wrong.
The exercise also suggests thinking about time and space complexity, which it denotes as $T$ and $O$. Now suppose we find that both algorithms are $O(n)$: what is $T$ then? Or in general, what is the difference between the two in this specific example?
Thanks a lot for your help!
You are generally correct, but off by one in some of your calculations.
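For example, summing $n$ numbers takes only $n - 1$ additions if the first element starts the sum (assuming that accounting), so the exact counts come out as

$$ T_1(n) = \underbrace{n}_{\text{multiplications}} + \underbrace{n - 1}_{\text{additions}} = 2n - 1, \qquad T_2(n) = (n - 1) + 1 = n, $$

and both are of course still $O(n)$.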
You are completely incorrect on one point, though: the conclusion that your analysis "must somehow be wrong" because the exercise says the algorithms differ in complexity. This is not true. Two algorithms can have the same big-O complexity, yet one can be vastly faster than the other.
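As a quick illustration (a hypothetical Python sketch; the functions and the factor of 100 are made up for this example), both routines below are $O(n)$, yet the second is far slower because it does much more constant work per element:

```python
import timeit

def lean_sum(x):
    # one addition per element
    total = 0
    for xi in x:
        total += xi
    return total

def padded_sum(x):
    # still O(n), but with a large constant factor per element
    total = 0
    for xi in x:
        for _ in range(100):  # deliberately wasted work
            pass
        total += xi
    return total

x = list(range(10_000))
print(timeit.timeit(lambda: lean_sum(x), number=100))    # fast
print(timeit.timeit(lambda: padded_sum(x), number=100))  # much slower, same big-O
```

Big-O only describes how the running time grows with $n$; it says nothing about the constant factors, which is exactly where your two algorithms differ.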