My skills in analysis are not what they used to be, but in the last hour or so I decided to try a problem that I conceived.
Suppose that we choose some function $f$ from the set of all real functions of a real variable that:
1) are continuous,
2) are strictly decreasing,
3) are strictly positive,
4) satisfy $\lim_{x \to + \infty} f(x)=0$,
5) are such that $\sum_{n=1}^{+ \infty} f(n)$ converges.
Now, the question is whether the domain of convergence extends to some interval $(1-\alpha,1+\alpha)$ of exponents, for some $\alpha>0$ (of course, this $\alpha$ depends on which $f$ we choose).
We look at an interval around $1$ because $\sum_{n=1}^{+ \infty} f(n)=\sum_{n=1}^{+ \infty} (f(n))^{1}$. So we want to know whether conditions 1)–5) are enough to guarantee that convergence of $\sum_{n=1}^{+ \infty} (f(n))^{s}$ at $s=1$ is just a glimpse of convergence on some $\alpha$-neighborhood of $1$, that is, whether convergence at a single exponent implies convergence on some open interval of exponents.
I am really not sure whether I have a proof that this is the case. I have some notes on paper that seem to prove that convergence extends from a point to an open interval, but some steps in my "proof" are not, in my opinion, rigorously established, or at least not rigorously enough.
So, I would like to see how you would prove this extendability, if it really is possible under those five conditions on $f$.
If extendability under these five conditions is not always possible, do you have an example of an $f$ for which convergence cannot be extended from a point to an interval?
Let $S$ be the function space you defined. DanielWainfleet's comment is a good point, so I'll distinguish three cases for clarity. My guess is that you meant $(1)$, since it is the more delicate one.
$(1)$ Perturbing the exponent: is it true that $\forall f \in S\ \exists \epsilon > 0$ such that for all $0 \leq \alpha < \epsilon$ $$\sum_{n=1}^{\infty} f(n)^{1 \pm \alpha} < \infty\,?$$

$(2)$ Perturbing the argument: is it true that $\forall f \in S\ \exists \epsilon > 0$ such that for all $0 \leq \alpha < \epsilon$ $$\sum_{n=1}^{\infty} f(n \pm \alpha) < \infty\,?$$

$(3)$ Perturbing both simultaneously: is it true that $\forall f \in S\ \exists \epsilon > 0,\ \delta > 0$ such that for all $0 \leq \alpha < \epsilon$, $0 \leq \beta < \delta$ $$\sum_{n=1}^{\infty} f(n \pm \alpha)^{1 \pm \beta} < \infty\,?$$
I claim that $(1)$ is false, with the following counterexample. For $x \geq 2$ define $f(x) = \frac{1}{x (\ln x)^2}$, set $f(1) = f(2) + 1$, and interpolate on the interval $[1,2]$ so that $f$ is continuous and strictly decreasing there (for example, piecewise-linear interpolation works). Then $f$ is strictly decreasing, strictly positive, and continuous, and by the integral test $$\sum_{n=1}^{\infty} f(n) < \infty.$$ For $0 < \alpha < 1$, we have $$\frac{d}{dx}\left(f(x)^{1-\alpha}\right) = (1-\alpha)f(x)^{-\alpha}f'(x) < 0$$ since $1-\alpha > 0$, $f(x)^{-\alpha} > 0$, and $f'(x) < 0$, so $f^{1-\alpha}$ is again positive and decreasing and the integral test applies. For $x \geq 2$, $$f(x)^{1-\alpha} = \frac{1}{x^{1-\alpha}(\ln x)^{2(1-\alpha)}},$$ and since the exponent $1-\alpha$ of $x$ is strictly less than $1$ (the logarithm cannot compensate for a power), the integral $\int_2^{\infty} f(x)^{1-\alpha}\,dx$ diverges. Therefore, for any choice of $0 < \alpha < 1$, $$\sum_{n=1}^{\infty} f(n)^{1-\alpha} = \infty.$$ Thus $(1)$ is false.
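Not a proof, but a quick numerical sanity check of the counterexample may be reassuring. This is a sketch in Python; the helper `partial_sum` and the cutoffs $10^3$, $10^6$ are ad hoc choices of mine. The partial sums of $\sum f(n)$ barely move past $n = 10^3$, while those of $\sum f(n)^{1/2}$ (the case $\alpha = \tfrac12$) keep growing:

```python
import math

def f(x):
    """Counterexample from case (1): 1/(x ln(x)^2) for x >= 2,
    linearly interpolated on [1, 2] with f(1) = f(2) + 1."""
    f2 = 1.0 / (2.0 * math.log(2.0) ** 2)
    if x >= 2.0:
        return 1.0 / (x * math.log(x) ** 2)
    # line from f(1) = f2 + 1 down to f(2) = f2 (slope -1)
    return (f2 + 1.0) - (x - 1.0)

def partial_sum(g, N):
    return sum(g(n) for n in range(1, N + 1))

# Partial sums of f(n): the series converges, so the tail is tiny.
s_f_small, s_f_big = partial_sum(f, 10**3), partial_sum(f, 10**6)

# Partial sums of f(n)^{1/2}: the series diverges,
# so the partial sums keep growing instead of flattening out.
g = lambda x: f(x) ** 0.5
s_g_small, s_g_big = partial_sum(g, 10**3), partial_sum(g, 10**6)

print(f"sum f(n):      n<=1e3: {s_f_small:.4f}   n<=1e6: {s_f_big:.4f}")
print(f"sum f(n)^0.5:  n<=1e3: {s_g_small:.2f}   n<=1e6: {s_g_big:.2f}")
```

Of course this only illustrates the contrast between the two series; the divergence itself is established by the integral test above.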
$(2)$ is true. Fix an arbitrary $\alpha \geq 0$; since $f$ is decreasing, each of the series $\sum f(n \pm \alpha)$ is dominated, up to a shift of index, by $\sum f(n)$.
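For instance, for $0 \leq \alpha \leq 1$ the comparison reads as follows (larger $\alpha$ is handled by shifting the index further, and finitely many initial terms never affect convergence):

```latex
\sum_{n=1}^{\infty} f(n+\alpha) \le \sum_{n=1}^{\infty} f(n) < \infty,
\qquad
\sum_{n=2}^{\infty} f(n-\alpha) \le \sum_{n=2}^{\infty} f(n-1)
  = \sum_{n=1}^{\infty} f(n) < \infty,
```

where both inequalities hold termwise because $f$ is decreasing: $n + \alpha \geq n$ and $n - \alpha \geq n - 1$.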
$(3)$ is false. Taking $\alpha = 0$ reduces $(3)$ to $(1)$: whatever $\epsilon, \delta > 0$ are chosen, the value $\alpha = 0$ together with some $0 < \beta < \delta$ must be allowed, and by the counterexample in $(1)$ the corresponding series $\sum f(n)^{1-\beta}$ diverges. So no extension is possible.