Time complexity for computing sum of first $n$ squares


I'm trying to compute the time complexity of computing the sum of the first $n$ squares. This is a problem from the textbook *A Course in Number Theory and Cryptography*.

The question is to compute the time complexity of evaluating the LHS and the RHS of the formula: $\sum_{j=1}^n j^2 = \frac{n(n+1)(2n+1)}{6}$

The answer in the textbook says $O(n\log^2 n)$, but I found a tighter bound for the algorithm: $O(\log^2 n)$.

The reason is: we think of the algorithm as summing $1 \times 1$, $2 \times 2$, ..., $n \times n$ (a total of $n$ steps).

At the $j$-th step, the multiplication $j \times j$ costs $O(\log^2 j)$ bit operations. Summing the $n$ terms, $O(\log^2 1)+O(\log^2 2)+\cdots+O(\log^2 n)\leq C \cdot O(\log^2 n)$, where $C$ is a constant.
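For concreteness, here is a minimal Python sketch of the two computations being compared (the function names are my own, not from the textbook):

```python
def sum_of_squares_lhs(n):
    # LHS: n additions, each preceded by one multiplication j * j
    # of two numbers with about log2(j) bits each.
    total = 0
    for j in range(1, n + 1):
        total += j * j
    return total

def sum_of_squares_rhs(n):
    # RHS: a constant number of multiplications on O(log n)-bit numbers.
    return n * (n + 1) * (2 * n + 1) // 6
```

Both return the same value, e.g. `sum_of_squares_lhs(10) == sum_of_squares_rhs(10) == 385`; the question is only how their bit-operation counts grow with $n$.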

Therefore, I get $\mathrm{Time}(\mathrm{LHS})=O(\log^2 n)$.

Is this correct? If I'm wrong, could you tell me why?

Thanks a lot :)



The problem is that your $C$ isn't a constant: it depends on $n$. The larger $n$ is, the more terms you have, and the sequence $\log^2 j$ doesn't grow fast enough for the final few terms to dominate the sum. This means your $C$ must grow with $n$. In fact, since at least half of the terms (those with $j > n/2$) are each at least $\log^2(n/2)$, the sum is $\Theta(n \log^2 n)$, matching the textbook's bound.
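To see this concretely, here is a small Python experiment (my own illustration, not from the textbook) that computes the smallest $C$ your inequality would need, i.e. $\sum_{j=2}^{n}\log_2^2 j \,/\, \log_2^2 n$, for increasing $n$:

```python
import math

def lhs_bit_cost(n):
    # Estimated bit operations for the LHS: multiplying j * j costs
    # about (log2 j)^2, summed over j = 2..n (j = 1 costs nothing).
    return sum(math.log2(j) ** 2 for j in range(2, n + 1))

# The smallest C for which  lhs_bit_cost(n) <= C * (log2 n)^2  holds:
for n in (10, 100, 1000, 10000):
    print(n, lhs_bit_cost(n) / math.log2(n) ** 2)
```

The printed "constant" grows roughly linearly with $n$, which is exactly the missing factor of $n$ in the $O(n \log^2 n)$ bound.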