Asymptotics of $\sum\limits_{i=1}^n i\log i$

I'm practicing complexity analysis for a problem and came across:

$$\sum_{i = 1}^{n} {\log(i!)}$$

I know that $\log(i!) = \Theta(i\log(i))$, so I further simplified the sum to:

$$\sum_{i = 1}^{n} {i\log(i)}$$

Breaking this out gives me:

$$1\log(1) + 2\log(2) + 3\log(3) + \cdots + n\log(n)$$

Since we have $1 + 2 + 3 + \cdots + n$ terms, I believe there will be $\frac{n(n+1)}{2}$ terms, thus simplifying the sum to:

$$\frac{n(n+1)}{2}\sum_{i = 1}^{n} {\log(i)}$$

and from my research $$\sum_{i = 1}^{n} {\log(i)} = \log(n!)$$

thus giving me the total:

$$\frac{n(n+1)}{2}\log(n!) = O(n^2\log(n!))$$

However, given my other question on SO, it appears this calculation is incorrect.

Could someone clear this up?

There are 3 answers below.

Accepted answer:

Your reasoning goes awry when you say "I believe there will be $\frac{n(n+1)}{2}$ terms" (actually even before, as we will see). You only have $n$ terms, and you cannot take that number "outside" the sum.

The $i$th term is $i\log i$; it is not $\log i$ or anything else. At that point, there is no "simplifying and taking outside the sum" to do: you have to deal with the summands as they are.


Now, for the original question, and more on your approximation $\log(i!) = \Theta(i\log i)$: you would have to justify it. As stated it is quite handwavy, and it is better to make it precise.

You have $$ \sum_{i=1}^n \ln(i!) = \sum_{i=1}^n \sum_{j=1}^i \ln j = \sum_{j=1}^n \sum_{i=j}^n \ln j = \sum_{j=1}^n (n-j+1) \ln j = (n+1)\sum_{j=1}^n \ln j - \sum_{j=1}^n j\ln j \tag{1} $$ which is not at first glance the same thing...$^{(\dagger)}$ Let's deal with each term separately.

  • As we know (see for instance this other question), $ \sum\limits_{j=1}^n \ln j = n \ln n + O(n) $, so the first term is $$ (n+1)\sum_{j=1}^n \ln j = n^2\ln n + O(n^2). \tag{2} $$

  • By a series/integral comparison (since $f$ defined by $f(x) = x\ln x$ is monotone and nonnegative; see again this other question for more details on the method if you are not familiar with it), it is not hard to show that $$ \sum_{j=1}^n j \ln j = \int_1^n x\ln x \,dx + O(n\ln n) = \frac{1}{2}n^2 \ln n + O(n^2). \tag{3} $$

Combining (2) and (3) into (1), we get $$ \sum_{i=1}^n \ln(i!) = n^2\ln n - \frac{1}{2}n^2 \ln n + O(n^2) = \frac{1}{2}n^2 \ln n + O(n^2). $$
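Both the exact identity (1) and this final estimate are easy to sanity-check numerically. Below is a minimal Python sketch (natural logarithms assumed throughout, and `math.lgamma` used to compute $\ln(i!) = \ln\Gamma(i+1)$):

```python
import math

def log_fact_sum(n):
    # sum_{i=1}^n ln(i!), computed via lgamma(i + 1) = ln(i!)
    return sum(math.lgamma(i + 1) for i in range(1, n + 1))

# Check the exact identity (1) for a small n: both sides agree up to rounding.
n = 7
rhs = (n + 1) * sum(math.log(j) for j in range(1, n + 1)) \
      - sum(j * math.log(j) for j in range(1, n + 1))
print(log_fact_sum(n), rhs)

# Check the estimate: the ratio tends to 1, although slowly, since the
# O(n^2) error term is only a factor of log n below the leading term.
for n in (10, 100, 1000, 10000):
    print(n, log_fact_sum(n) / (0.5 * n * n * math.log(n)))
```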


$(\dagger)$ Actually, your original approximation is true, because of the theorem below. However, you should cite or justify such a result before using it: if you do not know how or why it holds, you are very likely to end up making mistakes.

Theorem. Let $(a_n)_n, (b_n)_n$ be two non-negative sequences such that $a_n\sim_{n\to\infty} b_n$. Then the series $\sum\limits_n a_n$ converges iff the series $\sum\limits_n b_n$ converges; moreover:

  • If the series converge, then $\sum\limits_{n=N}^\infty a_n \sim_{N\to\infty} \sum\limits_{n=N}^\infty b_n$ (the remainders are equivalent);
  • If the series diverge, then $\sum\limits_{n=1}^N a_n \sim_{N\to\infty} \sum\limits_{n=1}^N b_n$ (the partial sums are equivalent).
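In the present case, $a_n = \ln(n!)$ and $b_n = n\ln n$ are non-negative with $a_n \sim b_n$ (by Stirling), and both series diverge, so the theorem gives exactly the equivalence of partial sums used in the question. A small numerical illustration (a Python sketch, with `math.lgamma` standing in for $\ln(n!)$):

```python
import math

# a_n = ln(n!) and b_n = n ln n satisfy a_n ~ b_n (Stirling), and both
# series diverge, so the theorem predicts equivalent partial sums.
for N in (10, 100, 1000, 10000):
    sa = sum(math.lgamma(n + 1) for n in range(1, N + 1))  # partial sum of a_n
    sb = sum(n * math.log(n) for n in range(1, N + 1))     # partial sum of b_n
    print(N, sa / sb)  # the ratio tends to 1 (slowly)
```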
Answer:

Your estimate is wrong: there is no reason to go from $$\sum_{i=1}^n i \log i$$ to $$\frac{n(n+1)}{2}\sum_{i=1}^n \log i;$$ these two functions are not asymptotically equivalent.

A good idea is the following: approximate the sum with an integral, $$\sum_{i=1}^n i \log i = \int_1^n x \log x \ \mathrm{d}x + O(n \log n).$$ The integral can be computed exactly by integrating by parts: $$\int_1^n x \log x \ \mathrm{d}x = \frac{1}{2} n^2 \log n -\frac{1}{4} (n^2-1),$$ so that the final estimate is $$\sum_{i=1}^n i \log i \sim \frac{1}{2} n^2 \log n, \quad\text{hence}\quad \sum_{i=1}^n i \log i = O( n^2 \log n).$$
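A quick numerical check of the integral approximation (a Python sketch; natural logarithm assumed):

```python
import math

for n in (10, 100, 1000, 10000):
    s = sum(i * math.log(i) for i in range(1, n + 1))
    integral = 0.5 * n * n * math.log(n) - 0.25 * (n * n - 1)
    # The gap grows like O(n log n), far below the n^2 log n leading term.
    print(n, s - integral, n * math.log(n))
```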

Answer:

$$\sum_{i=1}^{n}\log(i!)=\sum_{i=1}^{n}\sum_{j=1}^{i}\log(j) = \sum_{j=1}^{n}(n+1-j)\log(j)=\sum_{k=1}^{n}k\log(n+1-k) \tag{1}$$ is straightforward to study through Abel's summation formula:

$$\sum_{k=1}^{n}k\log(n+1-k) = \int_{1}^{n}\frac{\lfloor x\rfloor(\lfloor x\rfloor+1)}{2(n+1-x)}\,dx \tag{2} $$ from which $$\boxed{ \sum_{i=1}^{n}\log(i!) = \frac{n^2}{2}\log(n)-\frac{3n^2}{4}+O(n\log n)}\tag{3}$$ easily follows.
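As a sanity check of the boxed estimate (a Python sketch; `math.lgamma` gives $\ln(i!)$, and natural logarithms are assumed):

```python
import math

for n in (100, 1000, 10000):
    exact = sum(math.lgamma(i + 1) for i in range(1, n + 1))
    approx = 0.5 * n * n * math.log(n) - 0.75 * n * n
    # The error stays within O(n log n), as claimed in (3).
    print(n, exact - approx, n * math.log(n))
```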