Let $f:[1,\infty)\to [0,\infty)$ be a monotonically decreasing function, $\int\limits_1^{\infty}f(x)\,dx$ the corresponding improper integral, and $\sum\limits_{k=1}^{\infty}f(k)$ a convergent series. For the sake of simplicity, let's assume we already know that $\int\limits_1^{t}f(x)\,dx\leq\sum\limits_{k=1}^{\infty}f(k)$ for all $t\in (1,\infty)$. Our professor showed that $\int\limits_1^{\infty}f(x)\,dx$ exists in a very strange way, which I think is wrong.
Professor's proof:
We assume that $\int\limits_1^{\infty}f(x)\,dx$ is divergent. Then there exists a sequence $\left(\beta_k\right)_{k\in\mathbb{N}}$ in $(1,\infty)$ with $\lim\limits_{k\to\infty}\beta_k=\infty$ such that the sequence $\left(b_k\right)_{k\in\mathbb{N}}$ defined by $b_k:=\int\limits_1^{\beta_k}f(x)\,dx$ is divergent. Let $\left(\beta_k\right)_{k\in\mathbb{N}}$ be monotonically increasing; then $\left(b_k\right)_{k\in\mathbb{N}}$ is also monotonically increasing, since $f\geq 0$. Moreover, $\left(b_k\right)_{k\in\mathbb{N}}$ is bounded above by $\sum\limits_{k=1}^{\infty}f(k)$ thanks to the inequality assumed at the beginning. So $\left(b_k\right)_{k\in\mathbb{N}}$ is monotone and bounded and hence convergent (by the monotone convergence theorem). This contradicts the assumption, so $\int\limits_1^{\infty}f(x)\,dx$ must be convergent.
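For reference, the two facts this step relies on can be written out explicitly under the stated hypotheses ($f\geq 0$ and the assumed inequality):
$$b_{k+1}-b_k=\int\limits_{\beta_k}^{\beta_{k+1}}f(x)\,dx\geq 0
\qquad\text{and}\qquad
b_k=\int\limits_1^{\beta_k}f(x)\,dx\leq\sum\limits_{k=1}^{\infty}f(k).$$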
How is it possible to draw this conclusion if we only look at sequences $\left(\beta_k\right)_{k\in\mathbb{N}}$ that are monotonically increasing? I suspect that the professor has confused divergence with unboundedness, which are not equivalent notions in general.
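To illustrate that divergence and unboundedness really are different notions in general, a standard example (which of course does not satisfy our hypotheses, since the integrand is not nonnegative) is
$$\int\limits_1^{t}\cos(x)\,dx=\sin(t)-\sin(1),$$
which stays bounded as $t\to\infty$ but has no limit, so $\int\limits_1^{\infty}\cos(x)\,dx$ diverges without the partial integrals being unbounded.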
What if we include all sequences (also the non-monotone ones)? Then we can no longer make use of the monotone convergence theorem, because it is not guaranteed that $b_k:=\int\limits_1^{\beta_k}f(x)\,dx$ is monotonically increasing. Any help or comments are appreciated!