I have a seemingly trivial problem that I cannot seem to figure out.
Imagine I want to prove that the series of $n$ does not converge, and for that I use d'Alembert's rule (yes, it's ridiculous, but it's just for the example). I learned in school that if $(U_n)$ is a strictly positive sequence and
$$ \lim_{n \to \infty} \frac{U_{n+1}}{U_n} =1^{+} $$
then the series of $(U_n)$ is divergent.
So here is my problem: what's the result of $$\lim_{n \to \infty}\frac{n+1}{n},$$
is it $1$ or $1^{+}$ ?
(Because if the limit is $1$ we can't conclude anything, but if the limit is $1^{+}$ we can conclude that the series of $n$ is divergent.)
It's important to understand that $$\lim \limits_{n \to \infty} \left(\dfrac{U_{n+1}}{U_n}\right) =1^{+}$$ is an abbreviation of
$$\lim \limits_{n \to \infty} \left(\dfrac{U_{n+1}}{U_n}\right) =1\land \exists p\in \mathbb N\forall n\in \mathbb N\left(n\ge p\implies \dfrac{U_{n+1}}{U_n}\ge 1\right).$$
In view of this, applying this criterion to the series $\sum \limits_{n=1}^\infty n$ poses no problem (the sum starts at $n=1$ so that every term is strictly positive and the ratio is defined).
Obviously $\lim \limits_{n\to \infty}\left(\dfrac{n+1}n\right)=1$ and setting $p=1$ it also trivially holds that $$\forall n\in \mathbb N\left(n\ge 1\implies \dfrac{n+1}{n}\ge 1\right).$$
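As a quick numerical sanity check (a sketch, not part of the proof), one can tabulate the ratios $\frac{n+1}{n}$ and confirm that they are always at least $1$, i.e. the limit is approached from above, while still tending to $1$:

```python
def ratio(n: int) -> float:
    """Ratio U_{n+1}/U_n of consecutive terms of the sequence U_n = n."""
    return (n + 1) / n

# Sample the ratio at increasingly large n.
samples = [ratio(n) for n in (1, 10, 100, 1000, 10**6)]
print(samples)

# Every sampled ratio is >= 1 (consistent with the "1+" convention above),
# yet the values visibly approach 1.
assert all(r >= 1 for r in samples)
```

This only illustrates the two conditions in the abbreviation, of course; the formal verification is the one given above with $p=1$.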