I'm studying power series and their radius of convergence, and my book says
$$\sum_{k\ge0} x^k$$
has a radius of convergence of 1 and converges at neither $1$ nor $-1$, as opposed to
$$\sum_{k\ge1} \frac {1}{k} x^k$$
which also has a radius of convergence of 1, but converges at $-1$ and diverges at $1$.
To my understanding, the latter converges at $-1$ because we keep adding and subtracting numbers, so the sum never gets very large, while at $1$ we get the number series $\sum_{k\ge1} \frac{1}{k}$, which is known to diverge.
Doesn't the same apply to the former series? I mean, at $x=1$ it is pretty obvious why it diverges, but at $x=-1$ wouldn't we keep adding and subtracting $1$, and thus never diverge?
Is it that the endless jumping of the partial sums between $1$ and $0$ is not considered convergence (while also not being "divergence to infinity"), because the limit does not exist, as opposed to the second series, in which the added and subtracted terms get ever smaller, so the partial sums get closer and closer to a fixed value?
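This intuition can be checked numerically. The following sketch (my own illustration, not from the book) computes partial sums of both series at $x=-1$: the first bounces between $1$ and $0$ forever, while the second settles toward $-\ln 2$.

```python
import math

# Partial sums of sum_{k>=0} (-1)^k: they never settle down.
s = 0.0
geometric_partials = []
for k in range(6):
    s += (-1.0) ** k
    geometric_partials.append(s)
print(geometric_partials)  # [1.0, 0.0, 1.0, 0.0, 1.0, 0.0]

# Partial sums of sum_{k>=1} (-1)^k / k: they approach -ln(2).
t = 0.0
for k in range(1, 100001):
    t += (-1.0) ** k / k
print(t, -math.log(2))  # t is within about 1e-5 of -0.693147...
```

The alternating-series error bound guarantees the second partial sum is within $1/(N+1)$ of its limit, which is why 100000 terms already give five correct digits.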
By the $n$th term test, a necessary condition for a series $\sum a_n$ to converge is:
$$\lim_{n\to\infty} a_n = 0$$
If not, then the series diverges. In this case the limit does not exist, as the terms oscillate between $1$ and $-1$ (and the partial sums between $1$ and $0$), so the series diverges:
$$\lim_{n\to\infty} (-1)^n = \text{DNE}$$
Divergence does not necessarily mean becoming arbitrarily large; it just means the partial sums do not approach a value. Divergence essentially means the lack of convergence.
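As a small numerical illustration of the test (my own sketch): the terms of the first series stay at size $1$ no matter how far out in the tail you look, while the terms of the second shrink to $0$.

```python
# Terms a_n = (-1)^n of the first series at x = -1:
# even deep in the tail, |a_n| is still 1, so a_n does not tend to 0.
geometric_tail = [(-1) ** n for n in range(10**6, 10**6 + 4)]
print(geometric_tail)  # [1, -1, 1, -1]

# Terms b_k = (-1)^k / k of the second series: |b_k| -> 0, so the
# n-th term test is inconclusive there (and the series does converge).
harmonic_tail = [(-1) ** k / k for k in range(10**6, 10**6 + 4)]
print(max(abs(b) for b in harmonic_tail))  # 1e-06
```

Note that $|b_k| \to 0$ alone does not prove convergence (the harmonic series at $x=1$ is the standard counterexample); for the alternating case it is the alternating series test that finishes the job.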