Divergence of Harmonic Series


I understand that the divergence of the harmonic series is a classic proof, but what I don't understand is the way it seems to contradict the standard method of determining whether a series converges.

For the series $\sum \frac{1}{n}$, the terms $\frac{1}{n}$ tend to $0$ as $n$ approaches infinity, which would seem to suggest convergence. We know the series diverges nonetheless, so why does this seemingly standard procedure fail? Is there some extra condition that must be checked beyond taking the limit of the terms?


If the terms of a series do not approach $0$, then the series diverges.

If the terms of a series do approach $0$, then the series may or may not converge.

The harmonic series is an example of a series that does not converge though the terms approach $0$.
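To see this concretely, here is a short sketch (the function name `harmonic_partial_sum` is my own, not from the original post) that evaluates the partial sums $H_n = \sum_{k=1}^{n} \frac{1}{k}$ at powers of $2$. Each doubling of the number of terms adds at least $\frac{1}{2}$, since $\sum_{n=2^k+1}^{2^{k+1}} \frac{1}{n} \ge 2^k \cdot \frac{1}{2^{k+1}} = \frac{1}{2}$, so the partial sums grow without bound even though the individual terms shrink to $0$:

```python
def harmonic_partial_sum(n):
    """Sum of 1/k for k = 1..n (a hypothetical helper for illustration)."""
    return sum(1.0 / k for k in range(1, n + 1))

# Partial sums at n = 2, 4, 8, 16, 32: each doubling adds at least 1/2,
# even though the terms 1/n themselves are tending to 0.
for k in range(1, 6):
    n = 2 ** k
    print(f"H_{n} = {harmonic_partial_sum(n):.4f}")
```

Running this shows the partial sums climbing steadily with no ceiling, which is exactly the grouping argument behind the classic proof of divergence.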

See the Wikipedia article on the harmonic series.