The usage of notation $\lim a_n = 1^+$ for "$a_n$ approaches $1$ from above"


I have a seemingly trivial problem that I cannot figure out.

Imagine I want to prove that the series of $n$ does not converge, and for that I use d'Alembert's ratio test (yes, it's a ridiculous choice, but it serves my example). I learned in school that if $(U_n)$ is a strictly positive sequence and

$$ \lim_{n \to \infty} \frac{U_{n+1}}{U_n} =1^{+} $$

then the series of $(U_n)$ is divergent.

So here is my problem: what's the result of $$\lim_{n \to \infty}\frac{n+1}{n},$$

is it $1$ or $1^{+}$?

(Because if the limit is just $1$ we cannot conclude, whereas if the limit is $1^{+}$ we can conclude that the series of $n$ is divergent.)
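As a quick numeric sanity check (a sketch, not part of the proof), one can compute the ratio $\frac{U_{n+1}}{U_n} = \frac{n+1}{n}$ for a few large $n$ and observe that every value is strictly greater than $1$ while the values tend to $1$:

```python
# Numeric check of the ratio U_{n+1}/U_n for U_n = n:
# every ratio is strictly greater than 1, yet the values tend to 1.
ratios = [(n + 1) / n for n in (10, 100, 1000, 10**6)]
print(ratios)

# Each ratio sits strictly above 1 ...
assert all(r > 1 for r in ratios)
# ... while the last one is already within 1e-5 of 1.
assert abs(ratios[-1] - 1) < 1e-5
```

This is exactly the behavior the notation $1^{+}$ is meant to capture.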

There are 2 answers below.

BEST ANSWER

It's important to understand that $$\lim \limits_{n \to \infty} \left(\dfrac{U_{n+1}}{U_n}\right) =1^{+}$$ is an abbreviation of

$$\lim \limits_{n \to \infty} \left(\dfrac{U_{n+1}}{U_n}\right) =1\land \exists p\in \mathbb N\forall n\in \mathbb N\left(n\ge p\implies \dfrac{U_{n+1}}{U_n}\ge 1\right).$$

In view of this, applying this criterion to the series $\sum \limits_{n=0}^\infty n$ should pose no problem.

Obviously $\lim \limits_{n\to \infty}\left(\dfrac{n+1}n\right)=1$ and setting $p=1$ it also trivially holds that $$\forall n\in \mathbb N\left(n\ge 1\implies \dfrac{n+1}{n}\ge 1\right).$$
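Both parts of the abbreviation can be illustrated with a small numeric sketch (hypothetical values chosen only for demonstration, with $p = 1$ as above):

```python
# Sketch checking both clauses of the "= 1^+" abbreviation for U_n = n.
def ratio(n):
    """The ratio U_{n+1}/U_n for U_n = n."""
    return (n + 1) / n

# Clause 1: the ratios get arbitrarily close to 1.
assert abs(ratio(10**9) - 1) < 1e-8

# Clause 2: with p = 1, every ratio from n = 1 onward is at least 1.
assert all(ratio(n) >= 1 for n in range(1, 1000))
```

So the criterion applies, and the series $\sum n$ diverges.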


The limit is equal to $1$; the notation $1^{+}$ just indicates that the sequence approaches $1$ from above, i.e. from the positive side.