Doubt on application of convergence test on a series


Today I found the following theorem in a book by Stewart:

If $f(x)$ is continuous, positive, and decreasing on $[1, \infty)$, and $f(n)=a_n$, then

$i)$ If $\int_1^\infty f(x)\,dx$ converges, then $\sum_{n=1}^\infty a_n$ converges;

$ii)$ If $\int_1^\infty f(x)\,dx$ diverges, then $\sum_{n=1}^\infty a_n$ diverges.

I don't think I had any problem understanding the theorem or its proof. But then, in an example, Stewart applies the theorem to the following series:

$$\sum_{n=1}^\infty \frac{\ln n}{n} $$

Take $f(x)=\frac{\ln x}{x}$. It is easy to see that $f(x)$ is continuous and positive, but it is not decreasing on all of $[1, \infty)$, because

$f'(x)=\frac{1-\ln x}{x^2}<0$ when $\ln x > 1$, so $f'(x)<0$ for all $x\in (e, \infty)$.

In other words, $f(x)$ is not decreasing from $1$ to $e$ (in fact it is increasing there, since $f'(x)>0$ when $\ln x<1$), and therefore it is not decreasing on all of $[1, \infty)$. But Stewart, stating these facts himself, continues: "we then infer that $f(x)$ is decreasing when $x>e$, and thus we apply the integral test:

$$\int_1 ^\infty f(x) dx=...$$

and because the integral diverges, the series diverges".

Why can Stewart apply the theorem to this series when one of the conditions, that the function is decreasing in $[1, \infty)$, is not met? What have I not understood?


There are 2 answers below.

On BEST ANSWER

He can apply it because whether a series converges is not changed if you change the values of finitely many of its terms. In other words, the criterion is valid as long as the function is *ultimately* decreasing, i.e. decreasing from some point on.
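To make this concrete, here is a minimal sketch of why finitely many terms never affect convergence: for any fixed $N$, split the series as

$$\sum_{n=1}^{\infty} a_n = \underbrace{\sum_{n=1}^{N-1} a_n}_{\text{finite number}} + \sum_{n=N}^{\infty} a_n,$$

so the full series converges if and only if the tail $\sum_{n=N}^{\infty} a_n$ converges. Applying the integral test to the tail, where $f$ *is* decreasing, therefore settles convergence for the whole series.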


If we know that $f$ is decreasing on $[e,\infty)$ then the theorem shows that $\sum_{n=3}^\infty f(n)$ converges if and only if $\int_3^\infty f(t)\, dt$ converges. It's clear that $\sum_1^\infty$ converges if and only if $\sum_3^\infty$ converges, and similarly for $\int_1^\infty$ versus $\int_3^\infty$.
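For completeness, the divergence in Stewart's example can be checked directly with the substitution $u=\ln t$ (a standard computation, not part of the original answer):

$$\int_3^\infty \frac{\ln t}{t}\,dt = \lim_{b\to\infty} \left[\frac{(\ln t)^2}{2}\right]_3^b = \lim_{b\to\infty} \frac{(\ln b)^2-(\ln 3)^2}{2} = \infty,$$

so $\sum_{n=3}^\infty \frac{\ln n}{n}$ diverges, and hence so does $\sum_{n=1}^\infty \frac{\ln n}{n}$.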