It is rather obvious that $\sum_{n=1}^\infty |a_n|^\beta<\infty$ for some $0<\beta<1$ implies that $|a_n|^\beta<1$ for almost all $n$.
Hence, $|a_n|^\beta \geq |a_n|$ for almost all $n$. Therefore $\sum_{n=1}^\infty |a_n|^\beta<\infty$ implies $\sum_{n=1}^\infty |a_n|<\infty$.
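A quick numerical illustration of the comparison (my own illustrative choices, not part of the argument above: $a_n = 1/n^2$, $\beta = 0.6$):

```python
# Sanity check of the comparison argument: once a_n < 1,
# a_n^beta >= a_n for 0 < beta < 1, so the beta-series dominates.
# Illustrative choice (not from the original post): a_n = 1/n^2, beta = 0.6.

beta = 0.6
a = [1 / n**2 for n in range(1, 100001)]

# Termwise domination: a_n <= a_n^beta whenever a_n <= 1.
assert all(x <= x**beta for x in a)

partial_sum = sum(a)                         # approaches pi^2/6 ~ 1.6449
partial_sum_beta = sum(x**beta for x in a)   # sum of n^{-1.2}, also finite
assert partial_sum <= partial_sum_beta
```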
My question is whether we can get any quantitative results. Intuitively, the convergence of the first series implies "faster" convergence of the second series. Can we formalize this somehow? Is the rate of convergence affected?
I am wondering about this because I read a book on harmonic analysis in which the condition $\sum_{n=1}^\infty |f_n|^\beta<\infty$ is studied, where $f_n$ are the Fourier coefficients of some function $f$. For $\beta=1$ this is clearly interesting because it implies uniform convergence of the Fourier series, but the book also gives results for $\beta<1$, and I am unsure how these are relevant to actual approximation theory. Does the convergence of this series imply any "better" qualities of the Fourier approximation? If not, why are these results even mentioned?
For reference, the book is "Commutative Harmonic Analysis IV" by V.P. Khavin and N.K. Nikol'skiǐ.
If this is a common question studied within harmonic analysis, I would be more than thankful for any book or paper references; my search so far has only come up with convergence results for the above series under certain conditions. I haven't found any explanations of what that convergence actually means.
I'll assume throughout that $a_n\ge0$. Define $E_\beta(x) = \sum_{n>x} a_n^\beta$, so that the convergence of $\sum_{n=1}^\infty a_n^\beta$ is equivalent to $E_\beta(x) \to 0$ as $x\to\infty$. I interpret your question to be: what can we say about $E_1(x)$ in terms of $E_\beta(x)$ alone? A slight elaboration of your argument shows that $E_1(x)/E_\beta(x) \to 0$; can we say anything stronger?
In general, I believe the answer is no. (I'm writing this quickly, so double-checking will be welcome.) In other words, given any function $d(x)$ decaying to $0$, no matter how slowly, we should be able to find an example of a sequence $\{a_n\}$ such that $E_1(x)/E_\beta(x) > d(x)$ for all sufficiently large $x$. Indeed, I think an example can be found using a fixed convergent series like $1/n^2$, but spreading the positive terms out by inserting longer and longer strings of $0$s between the positive terms.
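The zero-insertion idea can be explored numerically. In the sketch below (my own illustrative parameters, not from the answer above: base terms $b_k = 1/k^4$, $\beta = 1/2$, so $\sum b_k^\beta = \sum 1/k^2$ converges, with the $k$-th positive term placed at position $p_k = 2^k$), for $x = p_k - 1$ the tails see exactly the terms $b_j$ with $j\ge k$, so $E_1(x)/E_\beta(x)$ equals the unspread ratio at index $k$ while $x$ itself is exponentially large; as a result the ratio decays only like $1/(\log x)^2$ and eventually exceeds, say, $d(x) = x^{-0.1}$:

```python
# Sketch of the "spread out the terms" construction (illustrative parameters,
# not from the answer): base terms b_k = 1/k^4 with beta = 1/2, placed at
# positions p_k = 2^k, with zeros in the gaps.  For x = p_k - 1 the tails
# E_1(x), E_beta(x) depend only on k, while x = 2^k is exponentially large.

beta = 0.5
N = 10**6  # truncation point for the (rapidly converging) tails

def ratio(k):
    """E_1(x)/E_beta(x) at x = 2^k - 1 for the spread-out sequence."""
    e1 = sum(1 / j**4 for j in range(k, N))  # tail of sum b_j
    eb = sum(1 / j**2 for j in range(k, N))  # tail of sum b_j^beta
    return e1 / eb

# The ratio decays like ~ 1/(3 k^2) in k, i.e. like 1/(log x)^2 in x = 2^k,
# so it eventually beats any fixed power of 1/x, e.g. d(x) = x^{-0.1}:
k = 200
x = 2.0**k
assert ratio(k) > x**(-0.1)
```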
So we would need additional information to get better bounds. A natural quantity to consider is $M(x) = \max\{a_n\colon n>x\}$, which must tend to $0$ since $\sum a_n$ converges. (If $a_n$ is decreasing then $M(x)$ is simply $a_{\lfloor x\rfloor+1}$.) It's clear that $$ E_1(x) = \sum_{n>x} a_n = \sum_{n>x} a_n^{1-\beta} a_n^\beta \le M(x)^{1-\beta} \sum_{n>x} a_n^\beta = M(x)^{1-\beta} E_\beta(x), $$ which is a quantitative improvement over the mere $E_1(x)/E_\beta(x) \to 0$. I suspect that nothing better than $E_1(x) \le M(x)^{1-\beta} E_\beta(x)$ (other than perhaps a multiplicative constant) is true in general: it is already the correct order of magnitude when $a_n$ is polynomially decaying and when $a_n$ is exponentially decaying.
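The inequality $E_1(x) \le M(x)^{1-\beta} E_\beta(x)$ is easy to check numerically; the sketch below uses the illustrative choice $a_n = 1/n^2$ with $\beta = 3/4$ (my parameters, chosen so that $\sum a_n^\beta = \sum n^{-3/2}$ also converges):

```python
# Numerical check of E_1(x) <= M(x)^{1-beta} * E_beta(x) for the
# illustrative decreasing sequence a_n = 1/n^2 with beta = 3/4.
# The inequality holds termwise (a_n <= M(x)^{1-beta} a_n^beta for n > x),
# so truncating both tails at N does not break it.

beta = 0.75
N = 10**5  # truncation point for the tail sums

a = [1 / n**2 for n in range(1, N + 1)]

def E(x, p):
    """Truncated tail sum: sum_{x < n <= N} a_n^p."""
    return sum(t**p for t in a[x:])

for x in (10, 100, 1000, 10000):
    M = a[x]  # a_n is decreasing, so max over n > x is a_{x+1}
    assert E(x, 1.0) <= M**(1 - beta) * E(x, beta)
```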