Say you have a convergent series $L = \sum_{n=0}^\infty a_n < \infty$.
Why is $\sum_{n=0}^\infty \frac{1}{a_n}$ not convergent? I mean, if $L \in \mathbb{R}$, then shouldn't $1/L$ also be in $\mathbb{R}$? What steps do I have to take in my head to figure this out by myself?
Thank you
You seem to be assuming that $\sum_{n=0}^{\infty}\frac{1}{a_n} = \frac{1}{\sum_{n=0}^{\infty}a_n}$, but this is completely false.
In general, $\frac{1}{a+b}\neq\frac1a+\frac1b$, and $\frac{1}{a+b+c}\neq\frac1a+\frac1b+\frac1c$, and so on. So this obviously won't work for infinite sums either.
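To see this with concrete numbers (chosen purely for illustration), take $a = b = 1$:
$$\frac{1}{a+b} = \frac{1}{1+1} = \frac{1}{2}, \qquad \text{but} \qquad \frac{1}{a}+\frac{1}{b} = 1 + 1 = 2.$$
The reciprocal of a sum is simply not the sum of the reciprocals.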
In fact, if one of $\sum_{n=0}^{\infty}\frac{1}{a_n}$ and $\sum_{n=0}^{\infty}a_n$ converges to a finite value, the other must diverge: for a series to converge its terms must tend to $0$, and $a_n$ and $\frac{1}{a_n}$ cannot both tend to $0$ (if $a_n\to 0$, then $\left|\frac{1}{a_n}\right|\to\infty$).
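As a concrete example (a geometric series, picked just for illustration), take $a_n = 2^{-n}$:
$$\sum_{n=0}^{\infty} \frac{1}{2^n} = 2, \qquad \text{yet} \qquad \frac{1}{a_n} = 2^n \to \infty,$$
so $\sum_{n=0}^{\infty} 2^n$ diverges. Here $\frac{1}{L} = \frac{1}{2}$ is a perfectly good real number, but it has nothing to do with the series of reciprocals.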
Infinite summation is linear: If $\sum_{n=0}^{\infty}a_n=a$ then $\sum_{n=0}^{\infty}ca_n=ca$, and if also $\sum_{n=0}^{\infty}b_n=b$ then $\sum_{n=0}^{\infty}(a_n+b_n)=a+b$, but that's basically it. For an arbitrary function $f$ you don't have $\sum_{n=0}^{\infty}f(a_n)=f(a)$. For example, in general $\sum_{n=0}^{\infty}a_n^2\neq a^2$, for the same reason that in general $(a+b)^2\neq a^2+b^2$ and $(a+b+c)^2\neq a^2+b^2+c^2$.
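The same geometric series illustrates the failure for $f(x)=x^2$:
$$\sum_{n=0}^{\infty} \left(2^{-n}\right)^2 = \sum_{n=0}^{\infty} 4^{-n} = \frac{1}{1-\frac{1}{4}} = \frac{4}{3}, \qquad \text{whereas} \qquad \left(\sum_{n=0}^{\infty} 2^{-n}\right)^2 = 2^2 = 4.$$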
If you're only interested in whether the series converges, and not in what its sum is, there are plenty of convergence tests to use; but here too you need to be careful and not make things up.