I am confused by something I read. Let $\epsilon_n(x) = \sum_{\mu = -\infty}^\infty (x + \mu)^{-n}.$ The book says that $\epsilon_n$ is absolutely convergent for $n \geq 2$; hence it is obvious that $\epsilon_n$ is periodic with period 1 for such $n.$ However, it then says that the same is true for $\epsilon_1$ because the terms of the series tend to $0$ as $\mu \rightarrow \pm \infty.$
Why do we need absolute convergence rather than ordinary convergence? I understand that absolute convergence allows us to add the terms in whatever order we want, so we can rearrange the terms of $\epsilon_n(x + 1)$ to match the order of summation of $\epsilon_n(x)$ for $n \geq 2.$ But is it true that ordinary convergence alone does not guarantee periodicity with period 1? For some reason I find this hard to believe.
Furthermore, why does the fact that the terms tend to $0$ imply that $\epsilon_1$ is periodic with period 1?
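For what it's worth, here is a quick numerical sketch of my own (assuming the intended interpretation of $\epsilon_1$ is the symmetric partial sums $\sum_{\mu=-N}^{N}$). The partial sums for $x$ and $x+1$ appear to agree in the limit, since they differ only by the boundary terms $\frac{1}{x+N+1} - \frac{1}{x-N}$:

```python
import math

def eps1_partial(x, N):
    """Symmetric partial sum  sum_{mu=-N}^{N} 1/(x + mu)
    of the conditionally convergent series eps_1(x)."""
    total = 1.0 / x  # the mu = 0 term
    for mu in range(1, N + 1):
        # Pair the +mu and -mu terms; each pair equals 2x/(x^2 - mu^2),
        # so the paired series converges absolutely.
        total += 1.0 / (x + mu) + 1.0 / (x - mu)
    return total

x, N = 0.3, 10**5
a = eps1_partial(x, N)      # approximates eps_1(x)
b = eps1_partial(x + 1, N)  # approximates eps_1(x + 1)

# The two partial sums differ only by 1/(x + N + 1) - 1/(x - N),
# which tends to 0 as N -> infinity.
print(abs(a - b))

# If I'm not mistaken, the symmetric sums converge to pi*cot(pi*x).
print(abs(a - math.pi / math.tan(math.pi * x)))
```

The difference $|a - b|$ comes out on the order of $2/N$, consistent with the claim that $\epsilon_1(x+1) = \epsilon_1(x)$ when the series is summed symmetrically.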