Today I was going through the analysis script (lecture notes) my professor uses for his course (meaning he often refers to it), and I found a lemma called "Integral criterion for convergence of series".
I read its proof and nodded along, thinking I understood what had just been shown to me, until I realized that I had already seen this criterion in its famous form on Wikipedia: Integral test for convergence.
Lemma (as in my script): Let $\sum a_n$ denote a series. Define $\varphi(x) := a_n$ for all $x \in [n, n+1)$. Then $$ \int_0^{+\infty} \varphi \text{ exists} \iff \sum_{n=0}^{+\infty} a_n \text{ converges}. $$
This seems rather odd, because it does not require $a_n$ to be monotonically decreasing, as the usual statement does. It would therefore be a much stronger statement than the one found on Wikipedia, and I doubt that it is correct.
The proof, however, seems convincing to me (a layman with only one year of experience in pure mathematics):
Proof (as in my script, with my comments):
"$\implies$" Since $\varphi \equiv a_k$ on $[k, k+1)$, we have $$\int_k^{k+1} \varphi(x)\, dx = \int_k^{k+1} a_k\, dx = a_k \\ \implies \int_0^{n+1} \varphi(x)\, dx = \int_0^1 \varphi(x)\, dx + \dots + \int_n^{n+1} \varphi(x)\, dx = a_0 + \dots + a_n = \sum_{i=0}^n a_i. $$
Thus, if the improper integral $\int_0^{+\infty} \varphi$ exists, its values along $R = n+1$ are exactly the partial sums, so the series converges as well.
"$\Longleftarrow$" For $R > 0$ let $N := \max \lbrace n \in \mathbb{N} : n < R \rbrace$ (note that $N$ depends on $R$). Then by Chasles' relation (additivity of the integral over adjacent intervals) we know that $$\int_0^R \varphi = \int_0^N \varphi + \int_N^R \varphi. $$
For the first term we have $$\int_0^N \varphi = \sum_{n=0}^{N-1} a_n \to \sum_{n=0}^{+\infty} a_n \text{ as } N \to +\infty, $$ because by assumption the series $\sum a_n$ converges. This also gives $a_n \to 0$ as $n \to +\infty$, so for the second term we obtain $$ \left| \int_N^R \varphi \right| \leq \int_N^R |\varphi(x)|\, dx = (R-N)|a_N| \leq |a_N|, $$ which converges to zero as $R \to +\infty$ (since then $N \to +\infty$ as well). $\square$
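As a sanity check (my own Python snippet, not part of the script), one can compute $\int_0^R \varphi$ exactly for the step function and compare it with the partial sums, here for the non-monotone sequence $a_n = (-1)^n/(n+1)$ (the alternating harmonic series):

```python
# Step function phi(x) = a_n on [n, n+1) for a_n = (-1)^n / (n + 1);
# the sequence is not monotone, yet the integral over [0, n+1]
# equals the partial sum S_n exactly.

def a(n):
    return (-1) ** n / (n + 1)

def integral_phi(R):
    """Exact integral of phi over [0, R]: full steps of area a_k each,
    plus the last partial step of width R - N and height a_N."""
    N = int(R)  # number of full unit steps contained in [0, R]
    return sum(a(k) for k in range(N)) + (R - N) * a(N)

S_9 = sum(a(k) for k in range(10))  # partial sum a_0 + ... + a_9
print(integral_phi(10.0) - S_9)    # 0.0 -- the identity from the "=>" step
print(integral_phi(9.5) - sum(a(k) for k in range(9)))  # ~ 0.5 * a_9, the tail piece
```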
I would appreciate it if someone could point out the flawed arguments in the above 'proof' and maybe give an easy counterexample (if possible).
Why should $\sum_n a_n$ converging imply that $a_n$ is monotonically decreasing? Take for example $a_n = 1/n^2$ for $n$ odd and $a_n = 1/n^3$ for $n$ even: the sequence tends to zero, but certainly not monotonically. It seems like you're assuming that $a_n \to 0$ implies $a_n$ is monotonically decreasing.
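A quick numerical check (my own snippet, not part of the original answer) that this sequence goes to zero without being monotone:

```python
# a_n = 1/n^2 for odd n, 1/n^3 for even n: tends to 0, but not monotonically.
def a(n):
    return 1 / n**2 if n % 2 == 1 else 1 / n**3

print(a(3), a(4), a(5))  # a_4 < a_5: the sequence increases here
print(all(a(n) > a(n + 1) for n in range(1, 100)))  # False: not monotone
```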
However, your theorem is different from the actual integral test. For the integral test you are parametrizing a sequence $a_n$ by a function $f(x)$. So, for example, if you're looking at $\sum_{n=1}^\infty 1/n$, you pick $f(x) = 1/x$ and then argue that the upper and lower Riemann sums bound the partial sums of your series; in this case you'll see that it diverges. The point is that you need monotonicity to make this upper/lower Riemann sum argument. Also, a key fact is that $\int_1^N 1/x \, dx = \ln(N)$, something you can compute quickly (see below).
On the other hand, your theorem just rewrites the series as the integral of a step function. In this case the Riemann sums that approximate the integral are essentially exactly the steps of height $a_n$, so there's no need for monotonicity. Thus it's not a stronger statement, in the sense that it's less useful: the usual integral test takes advantage of the fact that we know how to integrate something like $1/x$ and can then evaluate the integral immediately.
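To illustrate the Riemann-sum sandwich the usual integral test relies on (my own snippet, using $f(x) = 1/x$ as above): monotonicity of $f$ gives $\ln(N+1) \le \sum_{n=1}^N 1/n \le 1 + \ln N$, which is easy to check numerically:

```python
import math

# For monotone decreasing f(x) = 1/x, lower/upper Riemann sums give
#   ln(N + 1) = int_1^{N+1} 1/x dx  <=  H_N := sum_{n=1}^N 1/n  <=  1 + ln(N)
N = 10_000
H_N = sum(1.0 / n for n in range(1, N + 1))
print(math.log(N + 1) <= H_N <= 1 + math.log(N))  # True: the sandwich holds
print(H_N, math.log(N))  # H_N grows like ln(N), so the series diverges
```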