Convergence of the series $\sum_n \frac{n^{\log(n)}}{\log(n)^n}$


I have a rough outline of a proof. Is this correct?

$$\sum_{n \geq 2} \frac{n^{\log(n)}}{\log(n)^n} \to a \in \mathbb{R} \iff \sum_{n \geq 2} 2^n \frac{(2^n)^{\log(2^n)}}{\log(2^n)^{2^n}} \to b \in \mathbb{R}$$ via the dyadic criterion (Cauchy condensation test), which applies because the terms are positive and eventually decreasing.

Analyzing the right-hand side:

$$\sum_{n \geq 2} 2^n \frac{(2^n)^{\log(2^n)}}{\log(2^n)^{2^n}} = \sum_{n \geq 2} 2^n \frac{(2^n)^{n\log2}}{(n\log2)^{2^n}} = \sum_{n \geq 2} \frac{(2^n)^{n\log2 + 1}}{(n\log2)^{2^n}} = \sum_{n \geq 2} \frac{2^{n^2\log2 + n}}{(n\log2)^{2^n}} = (\ast). $$
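The algebra above can be double-checked numerically by comparing the logarithms of the two sides (a Python sketch, not part of the proof; working in logs avoids the astronomically large numbers involved):

```python
import math

log = math.log  # natural logarithm, matching the post's log

def log_a(m):
    # log of the general term m^{log m} / (log m)^m
    return log(m) ** 2 - m * log(log(m))

for n in range(2, 20):
    lhs = n * log(2) + log_a(2 ** n)  # log of 2^n * a_{2^n}
    rhs = (n ** 2 * log(2) + n) * log(2) - 2 ** n * log(n * log(2))
    assert math.isclose(lhs, rhs, rel_tol=1e-9, abs_tol=1e-9)
print("identity verified for n = 2, ..., 19")
```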

We know for some sufficiently large $N$, $n \log 2 \ge 2$ for all $n\ge N$, so we can bound the series above by:

$$(\ast) \leq C_N + \sum_{n \geq N} \frac{2^{n^2\log2 + n}}{2^{2^n}} = C_N + \sum_{n \geq N} 2^{n^2\log2 + n - 2^n} = C_N + (\ast\ast),$$

where $C_N$ is the finite sum of the first terms of the condensed series, up to index $N$. Once again, for sufficiently large $M$ we have $m^2\log2 + m - 2^m \leq -m^2$ for all $m \ge M$. Hence

$$ (\ast\ast) \leq C_{\max(N,M)} + \sum_{n \geq \max(N,M)} 2^{-n^2} < \infty, $$ where $C_{\max(N,M)}$ collects the finitely many terms of $(\ast\ast)$ with $n < \max(N,M)$.

Finally, $\sum_n 2^{-n^2}$ converges by comparison: $2^{-n^2} \le n^{-2}$ for $n \ge 2$, and $\sum_n n^{-2}$ converges by the $p$-test. We conclude that the original series converges.
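As a numerical sanity check (a Python sketch, with the cutoffs 200 and 500 chosen arbitrarily), one can sum the series directly; computing each term via its logarithm avoids overflow:

```python
import math

def term(n):
    # n^{log n} / (log n)^n, computed as exp((log n)^2 - n*log(log n))
    return math.exp(math.log(n) ** 2 - n * math.log(math.log(n)))

s200 = sum(term(n) for n in range(2, 200))
s500 = sum(term(n) for n in range(2, 500))
print(s200)             # partial sum; terms beyond n = 200 are negligibly small
print(s500 - s200)      # the tail contributes essentially nothing
```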

There are 2 answers below.

BEST ANSWER

Let $a_n$ be the general term. Then $\ln a_n = (\ln n)^2 - n \ln(\ln n)$. For $n$ large enough, $(\ln n)^2 < n/2$ and $\ln \ln n > 1$. Therefore, $\ln a_n < n/2 - n = -n/2$, i.e. $a_n < e^{-n/2}$, so the series converges by comparison with the convergent geometric series $\sum e^{-n/2}$.
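This bound can be spot-checked numerically (a Python sketch; the starting point $n = 16$ is my own guess for where both conditions hold, since $e^e \approx 15.15$):

```python
import math

def a(n):
    # general term n^{log n} / (log n)^n, computed via its logarithm
    return math.exp(math.log(n) ** 2 - n * math.log(math.log(n)))

# (ln n)^2 < n/2 and ln(ln n) > 1 both hold from n = 16 onward,
# so a_n < e^{-n/2} there
ok = all(a(n) < math.exp(-n / 2) for n in range(16, 500))
print(ok)
```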


As Qiaochu Yuan noted in a comment, the Cauchy condensation test is likely overkill for this problem, and it makes it harder to see what is going on. A more direct argument begins by writing everything in terms of the natural exponential function and then making some estimates.

The Estimate

We have $$ \sum_{n \geq 2} \frac{n^{\log(n)}}{\log(n)^n} = \sum_{n\ge 2} \frac{\mathrm{e}^{\log(n)^2}}{\mathrm{e}^{n\log(\log(n))}} = \sum_{n\ge 2} \exp\left( \log(n)^2 - n \log(\log(n))\right). \tag{1} $$
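The rewriting in $(1)$ uses $n^{\log n} = \mathrm{e}^{\log(n)^2}$ and $\log(n)^n = \mathrm{e}^{n\log(\log(n))}$; a quick Python sanity check of this identity for small $n$ (a sketch, not part of the argument):

```python
import math

for n in range(2, 30):
    direct = n ** math.log(n) / math.log(n) ** n
    via_exp = math.exp(math.log(n) ** 2 - n * math.log(math.log(n)))
    assert math.isclose(direct, via_exp, rel_tol=1e-9)
print("summand identity holds for n = 2, ..., 29")
```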

It can be shown that there is some $N > 0$ such that $\log(n)^2 - n \log(\log(n)) < -n/2$ for all $n > N$ (see below). Since the exponential function is increasing, this implies that $$ \exp\left(\log(n)^2 - n \log(\log(n))\right) \le \exp\left(-\frac{n}{2}\right)$$ for all $n > N$. Combined with (1), this implies that $$ \sum_{n \geq 2} \frac{n^{\log(n)}}{\log(n)^n} \le \underbrace{\sum_{n = 2}^{N} \exp\left( \log(n)^2 - n \log(\log(n))\right)}_{=C} + \sum_{n=N+1}^{\infty} \exp\left( -\frac{n}{2} \right). $$

But this last series converges by the root test: $$ \limsup_{n\to\infty} \sqrt[n]{\left|\exp\left( -\frac{n}{2} \right) \right|} = \lim_{n\to\infty} \exp\left( -\frac{n}{2} \right)^{1/n} = \exp\left( -\frac{1}{2} \right) < 1. $$

Therefore the original series converges.

Explaining the Estimate

When I teach these kinds of problems, the biggest difficulty that students have at this point is typically deciding what should be done next. Should they attempt to apply one of the convergence tests (e.g. the ratio or root test)? Or should they attempt to bound the general summand (either above or below) by something "nice"? If they decide to try bounding the summand, what is the "right" choice of bounding function? The remainder of this answer is an attempt to explain how one might try to reason about this problem.

Roughly speaking, our intuition should be that $\log(n)^2$ grows fairly slowly compared to $n \log(\log(n))$, hence it is reasonable to guess that this series will converge. Thus we should be looking for a "nice" function which gives an upper bound for the general summand, but which shrinks to zero "fast enough" to allow the corresponding series to converge.

The term $n \log(\log(n))$ grows at least as fast as $n$. This is a quick little observation which makes it possible to eliminate the logarithms from this term. Stated more formally, $$ n \log(\log(n)) \ge n \tag{2}$$ for all $n \ge \mathrm{e}^{\mathrm{e}} =: N_1$.

The bound for the other term in the exponential is a little harder to cook up without relying on some gut intuition. My feeling is that I want to bound the general term of the series by something that "looks like" $\exp(-n)$, because I know that the series $\sum \exp(-n)$ is well-behaved. My intuition is that the logarithm function grows very slowly, so for any $\alpha > 0$, I should eventually have $$ \log(n) < n^{\alpha} \implies \log(n)^2 < n^{2\alpha}. \tag{3} $$ This can be made rigorous by looking at the limit $\lim_{n\to\infty} n^{-2\alpha}\log(n)^2$: two applications of L'Hospital's rule (for example) show that this limit is zero, which means that (3) is eventually true. Taking $\alpha = \frac{1}{4}$, this implies that there is some $N_2 > 0$ such that $$ \log(n)^2 < \sqrt{n} \tag{4} $$ for all $n > N_2$. Finally, making a relatively sloppy estimate (remember, the goal is just to show that the general term is bounded by "something nice"; obtaining a "sharp" estimate is not at all important right now), I know that $$ \sqrt{n} < \frac{n}{2} \tag{5} $$ whenever $n > 4 =: N_3$.

Let $N = \max\{N_1, N_2, N_3\}$. Then, combining the estimates at (2)–(5), we have that for any $n \ge N$, $$ \log(n)^2 - n \log(\log(n)) \overset{(2)}{\le} \log(n)^2 - n \overset{(4)}{\le} \sqrt{n} - n \overset{(5)}{\le} \frac{n}{2} - n = -\frac{n}{2}, $$ which is the estimate used above.
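The thresholds can be explored numerically (a Python sketch; the starting point 5600 is my own empirical guess, since the binding constraint $\log(n)^2 < \sqrt{n}$ only kicks in for $n$ in the thousands):

```python
import math

log, sqrt = math.log, math.sqrt

n0 = 5600  # guessed threshold beyond which all three estimates hold
for n in range(n0, 7000):
    assert n * log(log(n)) >= n       # holds once n >= e^e
    assert log(n) ** 2 < sqrt(n)      # the binding constraint
    assert sqrt(n) < n / 2            # holds once n > 4
    # hence the combined estimate used above:
    assert log(n) ** 2 - n * log(log(n)) < -n / 2
print("all three estimates verified for", n0, "<= n < 7000")
```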