My textbook defines: \begin{equation} \limsup (a_n) = \min\{M \in \mathbb{R} \mid \exists n_0\ \forall n > n_0,\ a_n \le M\}. \end{equation} And it gives an example: let $a_n = 1 + \frac{1}{n}$. Then $\limsup (a_n) = 1$.
This confuses me: under no circumstances is $1$ greater than or equal to an element of the sequence — it is always slightly smaller than $a_n$. What is my mistake?
Suppose, for contradiction, that $\limsup (a_n) = y > 1$, and pick any $M$ with $1 < M < y$, say $M = \frac{1+y}{2}$. For every $n > \frac{1}{M-1}$ we have $\frac{1}{n} < M - 1$, so $a_n = 1 + \frac{1}{n} < M$. Thus $M$ belongs to the set in your definition, and $M < y$, contradicting that $y$ is the minimum of that set. So no value greater than $1$ can be the $\limsup$. Your observation is actually correct, though: since $a_n > 1$ for every $n$, the number $1$ itself is not in the set — the set of eventual upper bounds is $(1,\infty)$, which has no minimum. That is why the definition is usually stated with $\inf$ rather than $\min$; the infimum of $(1,\infty)$ is exactly $1$.
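To see the threshold "$a_n \le M$ for all large $n$" concretely, here is a small numeric check (my own sketch, not from the textbook; the helper name `eventual_bound_index` is made up). It uses exact rationals so floating-point rounding cannot blur the comparison:

```python
from fractions import Fraction
from math import ceil

def a(n):
    """The sequence from the question, a_n = 1 + 1/n, as an exact rational."""
    return 1 + Fraction(1, n)

def eventual_bound_index(M):
    """Smallest n0 such that a_n <= M for all n >= n0 (requires M > 1).

    Solving 1 + 1/n <= M for n gives n >= 1/(M - 1).
    """
    return ceil(Fraction(1) / (Fraction(M) - 1))

# Every M > 1, no matter how close to 1, eventually bounds the sequence...
for M in [Fraction(2), Fraction(11, 10), Fraction(1001, 1000)]:
    n0 = eventual_bound_index(M)
    assert all(a(n) <= M for n in range(n0, n0 + 1000))
    print(f"M = {M}: a_n <= M for all n >= {n0}")

# ...but M = 1 itself never does: a_n > 1 for every n.
assert all(a(n) > 1 for n in range(1, 1000))
```

This is exactly the situation in the answer: the set of eventual upper bounds contains every $M > 1$ but not $1$ itself, so its infimum is $1$ while its minimum does not exist.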