Calculus 1 students are taught that some infinite series converge while others do not. However, there are summation methods under which some of those "divergent" series can be assigned finite values, in counterintuitive ways. For example,
- extend the formula for computing a geometric series outside of its "radius of convergence" to see strange things like $\sum_{n\in\mathbb{N}}2^n=-1$, or
- play with Taylor series of other functions in clever ways to see things like $\sum_{n\in\mathbb{N}}n=-\frac{1}{12}$.
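For concreteness, both computations above are formal manipulations, not claims of ordinary convergence. The geometric series formula $\sum_{n\in\mathbb{N}}x^n=\frac{1}{1-x}$ holds for $|x|<1$ (taking $0\in\mathbb{N}$ here); substituting $x=2$ anyway gives
$$\sum_{n\in\mathbb{N}}2^n=\frac{1}{1-2}=-1,$$
while evaluating the analytic continuation of $\zeta(s)=\sum_{n\geq 1}n^{-s}$, which converges only for $\operatorname{Re}(s)>1$, at $s=-1$ yields $\zeta(-1)=-\frac{1}{12}$ as the regularized value of $\sum_{n\geq 1}n$.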
I am wondering whether there is a more rigorous theory of convergence and divergence that accounts for those examples, and others like them, so that we could determine with certainty whether a series diverges. Is there such a theory? Searching online, I just get the usual calculus material (and it doesn't help that a TV show with a similar name keeps coming up).
As an example for the sake of this post, let's take the harmonic series
$$\sum_{n\in\mathbb{N}}\frac{1}{n}.$$
This seems to diverge no matter what. Or ... maybe we just have to wait for someone to come up with some clever sense in which it converges (to, I don't know, a quaternion or something more exotic). Is there a proof of this series' divergence that would fail for those other strange examples? Obviously the comparison test is no longer an option.
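Numerically, the divergence is very slow, which is suggestive: the partial sums $H_n$ track $\ln n+\gamma$, where $\gamma$ is the Euler–Mascheroni constant. A minimal sketch (the constant `EULER_GAMMA` and helper name below are my own, quoted to double precision):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler–Mascheroni constant

def harmonic_partial_sum(n: int) -> float:
    """Compute H_n = 1 + 1/2 + ... + 1/n, summing smallest terms
    first to reduce floating-point rounding error."""
    return sum(1.0 / k for k in range(n, 0, -1))

# The partial sums grow without bound, but only logarithmically:
# H_n = ln(n) + gamma + O(1/n).
for n in (10, 1000, 100000):
    print(n, harmonic_partial_sum(n), math.log(n) + EULER_GAMMA)
```

So the partial sums exceed any bound eventually, just at a logarithmic pace, which is what makes the divergence feel so stubborn.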
Here is a partial answer. The harmonic series corresponds to the unique pole of the Riemann zeta function. That is, even if we take the analytic continuation of the mapping $s\mapsto \sum_{n\in\mathbb{N}}n^{-s}$, we get a finite output for every $s\in\mathbb{C}$ ... except $s=1$. But I see that as just one choice of how to interpret the series: it fails to converge via analytic continuation, but this need not be a complex-analysis problem.
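To make the "unique pole" statement precise: near $s=1$ the zeta function has the Laurent expansion
$$\zeta(s)=\frac{1}{s-1}+\gamma+O(s-1),$$
where $\gamma$ is the Euler–Mascheroni constant. So, unlike at $s=-1$, analytic continuation offers no finite value to assign at $s=1$; the best it can do is describe how the divergence looks as $s\to 1$.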