Why is the time complexity of summing from 1 to n $O(n^2)$?


I was just recently introduced to Big-O notation and this confused me.

When summing from 1 to $n$, why is the time complexity $O(n^2)$?

In other words, why is $\sum_{i=1}^n i = 1 + 2 + ... + n = \frac{n(n+1)}{2} = O(n^2)$?

This is a screenshot from the course that shows the above equalities.

Intuitively, I think it should be $O(n)$ since $n$ is the largest factor and the rest are constants, so when added it seems it should produce $O(n)$.

The following is from a lesson that illustrates this idea, with comments from the instructor that follow the code:

def sum_odd(L):
    M = L[:]
    total = 0
    while M != []:
        if M[0] % 2 == 1:
            total = total + M[0]
        M = M[1:]  # line 7: builds a new list with the first element dropped
    return total

"The runtime of the loop depends on the iteration number. The slowest line is line 7. This line takes $n-1$ time on the first iteration, $n-2$ time on the second iteration, and so on until the second-last iteration, where it takes 1 operation to make a list of 1 element; on the last iteration it makes the empty list, which we can also say takes 1 operation. The sum of these operation counts is $O(n^2)$. Hence the runtime of the entire function is $O(n^2)$, since line 7 of the loop is the slowest part of this code."



BEST ANSWER

Writing $1 + 2 + \ldots + n = \frac{n(n+1)}{2} = O(n^2)$ is not talking about the complexity of performing the addition, it is saying that the function $\frac{n(n+1)}{2}$ belongs to the complexity class $O(n^2)$.

We say that $f(n) = O(g(n))$ or $f(n) \in O(g(n))$ if there exist $n_0$ and $M > 0$ such that $f(n) \leq Mg(n)$ whenever $n \geq n_0$. In other words, we can find a fixed scaling factor $M$ so that $f(n)$ is bounded by $Mg(n)$ when $n$ is big enough. For $f(n) = \frac{n(n+1)}{2}$ and $g(n) = n^2$, we could take, for example, $n_0 = 1$ and $M = 1$ and notice that:

$$\begin{eqnarray} n^2 - \frac{n(n+1)}{2} & = & \frac{1}{2}(2n^2 - n^2 - n) \\ & = & \frac{1}{2}(n^2 - n) \\ & = & \frac{1}{2}n(n-1) \end{eqnarray}$$

which, for $n \geq 1$, is clearly going to be $\geq 0$, and so we have $g(n) \geq f(n)$ which is enough to prove that $f(n) = O(g(n))$. More generally, you can prove that if $f$ and $g$ are polynomials and $g$ has the same or higher degree than $f$, then $f(n) = O(g(n))$.
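This bound is easy to sanity-check numerically. A small sketch (with hypothetical helper names `f` and `g`) verifying $f(n) \leq 1 \cdot g(n)$ for $n = 1, \ldots, 1000$:

```python
# f(n) = n(n+1)/2 and g(n) = n^2, with M = 1 and n_0 = 1 as in the proof above.
def f(n):
    return n * (n + 1) // 2

def g(n):
    return n * n

# Check f(n) <= 1 * g(n) for every n from 1 to 1000.
assert all(f(n) <= g(n) for n in range(1, 1001))
```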

The point of showing this is that if you have an algorithm that can be broken into parts of size $1, 2, 3, \ldots, n$ (e.g. an outer loop that runs from $m = 1$ to $n$ and an inner loop that runs from $i = 1$ to $m$), then the total size of those parts is going to be $1 + 2 + \ldots + n = \frac{n(n+1)}{2}$, and so the overall time complexity of the algorithm is $O(n^2)$.
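As an illustration of that pattern (a sketch of my own, not code from the course), the nested loops below do $m$ units of work on the $m$th outer pass, and the total count comes out to exactly $\frac{n(n+1)}{2}$:

```python
def count_operations(n):
    """Count units of work for an outer loop m = 1..n with an inner loop i = 1..m."""
    ops = 0
    for m in range(1, n + 1):
        for i in range(1, m + 1):
            ops += 1  # one unit of work per inner iteration
    return ops

# Total work equals the closed form n(n+1)/2.
assert count_operations(100) == 100 * 101 // 2
```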

The algorithm given demonstrates one such example. The loop runs $n$ times, and at the start of the $m$th iteration the list $M$ has length $n - m + 1$ (i.e. it starts at length $n$, then becomes length $n - 1$, and so on until it has length 1). In the $m$th iteration the operation $M = M[1:]$ creates a list of length $n - m$, which we take to cost as much as its size, i.e. it is equivalent to a loop with that many iterations. So we are running something that performs $n - 1$ operations, then $n - 2$, and so forth down to $0$. Ignoring the other steps in the loop, the algorithm as a whole performs $(n-1) + (n-2) + \ldots + 1 + 0 = \frac{n(n-1)}{2}$ operations, which is $O(n^2)$; we can show that the other parts add only $O(n)$ extra operations, which doesn't change the overall complexity.
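One way to see this concretely is to instrument the function and count the work done by the slice. This is a sketch; `sum_odd_counted` is my own variant, and it takes the cost of `M[1:]` to be the length of the list it builds:

```python
def sum_odd_counted(L):
    """Like sum_odd, but also tallies the work done by the M[1:] slices."""
    M = L[:]
    total = 0
    slice_work = 0
    while M != []:
        if M[0] % 2 == 1:
            total = total + M[0]
        slice_work += len(M) - 1  # M[1:] builds a list of len(M) - 1 elements
        M = M[1:]
    return total, slice_work

# For a list of length n = 10, the slice work is 9 + 8 + ... + 0 = 45 = n(n-1)/2.
total, work = sum_odd_counted(list(range(1, 11)))
assert work == 45
```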

ANSWER

You are confusing functions (such as the function describing the runtime of an algorithm) with algorithms themselves. $O$-notation applies to functions, not algorithms.

When analyzing algorithms, we consider the function $T(n)$ that gives the algorithm's runtime on inputs of size $n$. If $T(n)=1+2+\dots+n$, this might correspond to an algorithm that performs one step, then 2 steps, then 3 steps, ..., then $n$ steps. (It has nothing to do with an algorithm for adding the numbers from $1$ to $n$.) This function grows as a quadratic function of $n$, i.e. $T\in O(n^2)$.
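To see the quadratic growth directly, one can check (a small sketch of my own, not part of the answer) that $T(n)/n^2$ approaches $1/2$ as $n$ grows, which is exactly what $T \in O(n^2)$ predicts:

```python
def T(n):
    """T(n) = 1 + 2 + ... + n, the runtime function from the answer."""
    return sum(range(1, n + 1))

# The ratio T(n) / n^2 tends to 1/2, so T(n) <= n^2 for all n >= 1.
for n in (10, 100, 1000):
    print(n, T(n) / n**2)  # 0.55, 0.505, 0.5005
```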