While teaching power series in my AP Calculus class, an interesting question came up. I gave an answer that I believe is correct, but I can't justify it except "by definition", and it feels like a cop-out to say "since it's an interval of convergence, it must be an interval" (after all, there is the case where the interval of convergence is a set with a single element). The question was this:
Does the interval of convergence of a power series (in a real variable) have to be an interval, or can it be a union of disjoint intervals? It seems to me that, because the interval is normally found using the geometric series or the ratio test, both of which involve solving an inequality in $x$, the set of convergence has to be just that: an interval (or a set of cardinality 1, containing only the center of the series). Any insight you can give me into this (such as counterexamples or a better explanation) would be greatly appreciated. Thanks!
Consider the power series $\sum_{n=1}^\infty a_n(x-a)^n$.
The proof essentially comes down to comparison with a geometric series. Indeed, if we rewrite the power series to look like $\sum_{n=1}^\infty\left(|a_n|^{1/n}|x-a|\right)^n$ (taking absolute values, since we are testing for absolute convergence), then whenever $|a_n|^{1/n}\to c$ for some $c>0$, the terms behave like those of the geometric series $\sum_{n=1}^\infty\big(c(x-a)\big)^n$, which converges precisely for $|c(x-a)|<1$. Doing the algebra yields the inequality $a-\frac{1}{c}<x<a+\frac{1}{c}$, which is exactly the interval $\left(a-\frac1c,a+\frac1c\right)$. This yields the values of $x$ for which the series converges absolutely. We must take care at the endpoints, but note that $\left[a-\frac1c,a+\frac1c\right)$ and $\left(a-\frac1c,a+\frac1c\right]$ are still intervals.
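As a concrete instance of this argument (my own example, not part of the original question), take the center $a=3$ and coefficients $a_n=2^{-n}$:

```latex
% Concrete instance: center a = 3, coefficients a_n = 2^{-n}, so
% |a_n|^{1/n} = 1/2 = c and the radius is 1/c = 2.
\sum_{n=1}^{\infty} \frac{(x-3)^n}{2^n}
  \;=\; \sum_{n=1}^{\infty} \left( \frac{x-3}{2} \right)^{\!n}
% This geometric series converges precisely when |(x-3)/2| < 1,
% i.e. 1 < x < 5, the interval (a - 1/c, a + 1/c) = (3 - 2, 3 + 2).
% Endpoints: x = 5 gives \sum 1 (diverges); x = 1 gives \sum (-1)^n (diverges),
% so the interval of convergence is exactly (1, 5) -- a single interval.
```

Checking the endpoints can only add or remove the two boundary points, so no matter how they turn out, the result is still a single interval.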
If $|a_n|^{1/n}\to c$ and $|x-a|>\frac1c$, then $|a_n(x-a)^n|\approx\big(c|x-a|\big)^n\to\infty$ (here $x-a$ is constant as far as the summing index is concerned), so the terms do not go to $0$ and the series diverges. Thus the series converges for $|x-a|<\frac1c$ and diverges for $|x-a|>\frac1c$: the set of convergence is always an interval centered at $a$ (or the single point $\{a\}$ when $|a_n|^{1/n}\to\infty$), never a union of disjoint intervals.
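The dichotomy above can be checked numerically. This is a sketch of my own (the coefficients $a_n = 2^n$ are a made-up example, not from the question): here $|a_n|^{1/n} = 2 = c$, so the radius is $1/c = 1/2$ about the center $a = 0$, and since the series is geometric we can compare partial sums with the closed form $\frac{r}{1-r}$ where $r = 2x$.

```python
# Illustrative example: a_n = 2^n, center a = 0, so |a_n|^(1/n) = 2 = c
# and the radius of convergence is 1/c = 1/2.  The series sum 2^n x^n is
# geometric, so inside the radius it should approach r/(1 - r) with r = 2x.

def partial_sum(x, N, a=0.0):
    """Partial sum of sum_{n=1}^{N} 2^n (x - a)^n."""
    return sum((2.0 * (x - a)) ** n for n in range(1, N + 1))

inside = 0.25    # |x - a| = 0.25 < 1/2: should converge
outside = 0.75   # |x - a| = 0.75 > 1/2: terms blow up

# Inside the radius, partial sums approach the geometric limit r/(1 - r).
r = 2.0 * inside
print(abs(partial_sum(inside, 60) - r / (1.0 - r)))  # very small

# Outside the radius, the n-th term (2x)^n itself grows without bound,
# so the series cannot converge (its terms do not go to zero).
print((2.0 * outside) ** 60)  # astronomically large
```

The point of the experiment is that convergence fails *because the terms blow up*, which can only happen on the two rays $|x-a|>\frac1c$, never on some interior gap.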