I've been studying sequences and series recently. As I understand it, a sequence converges when it has a (finite) limit. Now, in this example
$$a_n = \frac{3n^2 - 5n + 7}{3n^3 - 5n + 7}$$
I get that $a_n$ behaves like $\frac{1}{n}$ for large $n$, so $\lim_{n\to \infty} a_n = 0$, which means this sequence converges. What confuses me is that the series $\sum \frac{1}{n}$ is known to diverge. My question is: can $\frac{1}{n}$ converge when working with sequences but diverge when working with series? Or does it diverge in both cases?
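Here is how I computed the limit, dividing numerator and denominator by $n^3$:
$$\lim_{n\to\infty} \frac{3n^2 - 5n + 7}{3n^3 - 5n + 7} = \lim_{n\to\infty} \frac{\frac{3}{n} - \frac{5}{n^2} + \frac{7}{n^3}}{3 - \frac{5}{n^2} + \frac{7}{n^3}} = \frac{0}{3} = 0$$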
Thank you in advance.
If a series converges, its general term must converge to zero. Indeed, write $s_{n} = \sum_{k=1}^{n}a_{k}$ for the partial sums, with $s_{0} = 0$. If the series converges, say $s = \sum_{n=1}^{\infty}a_{n}$, then $s_{n} - s_{n-1} = a_{n}$ and \begin{align*} \lim_{n\rightarrow\infty} a_{n} & = \lim_{n\rightarrow\infty} (s_{n}-s_{n-1})\\ & = \lim_{n\rightarrow\infty} s_{n} - \lim_{n\rightarrow\infty} s_{n-1} = s - s = 0 \end{align*}
However, the converse is not true. As you have noticed, the harmonic series diverges (by the integral test, for instance), even though its general term tends to zero: \begin{align*} \int_{1}^{\infty}\frac{\mathrm{d}x}{x} = +\infty \Longrightarrow \sum_{n=1}^{\infty}\frac{1}{n} = +\infty \quad \text{but}\quad\lim_{n\rightarrow\infty}\frac{1}{n} = 0 \end{align*}
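If it helps to see the distinction numerically, here is a small Python sketch (my own illustration, nothing beyond the standard library assumed) that prints the general term $1/n$ alongside the partial sum $\sum_{k=1}^{n} \frac{1}{k}$: the term shrinks to $0$, while the partial sums keep growing, roughly like $\ln n$.

```python
# Sketch: the general term 1/n tends to 0, but the partial sums
# of the harmonic series grow without bound (roughly like ln n).

checkpoints = {10, 100, 1_000, 10_000, 100_000, 1_000_000}

partial_sum = 0.0
for n in range(1, 1_000_001):
    partial_sum += 1.0 / n          # accumulate the harmonic partial sum
    if n in checkpoints:
        # term -> 0, yet the partial sum keeps increasing
        print(f"n = {n:>9}: term = {1.0 / n:.7f}, partial sum = {partial_sum:.4f}")
```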
Is this what you are asking for?