This question is motivated by my study of Rudin's *Real and Complex Analysis*, pages 18-19, "Arithmetic in $[0, \infty]$."
As in Rudin's book, suppose we define $a + \infty=\infty + a=\infty$ if $0\le a\le \infty$, and $$a\cdot \infty=\infty \cdot a = \left\{\begin{array}{ll}\infty, &\textrm{if } 0<a\le\infty\\0, &\textrm{if } a=0.\end{array}\right.$$ Let $\{a_n\}$ and $\{b_n\}$ be two sequences in $[0,\infty]$, with $a_n\to a$ and $b_n\to b,$ where $a$ or $b$ may be $\infty.$ Are the following two statements true?
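As a side note, these conventions differ from IEEE floating-point arithmetic, where $0\cdot\infty$ is undefined (`nan`). A minimal Python sketch of Rudin's conventions (the helper names `ext_add` and `ext_mul` are my own, not from the book):

```python
import math

def ext_add(a, b):
    """a + b in [0, inf]: infinity absorbs everything.
    Float addition already satisfies a + inf = inf."""
    return a + b

def ext_mul(a, b):
    """a * b in [0, inf] with Rudin's convention 0 * inf = 0.
    IEEE floats give 0 * inf = nan, so 0 must be special-cased."""
    if a == 0 or b == 0:
        return 0.0
    return a * b

print(ext_mul(0.0, math.inf))  # 0.0 by convention, whereas 0.0 * math.inf is nan
print(ext_add(5.0, math.inf))  # inf
```

The special case for zero is the entire content of the convention; every other product of nonnegative floats already agrees with Rudin's table.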
(1) $a_n+b_n\to a+b,$ always.
(2) $a_nb_n$ may not converge to $ab$ in general. It does, however, if $\{a_n\}$ and $\{b_n\}$ are both increasing.
An example witnessing the failure in (2) that I can think of is: $a_n=n^2$ and $b_n=\frac{1}{n}$. In this case, $a_n\to a=\infty$ and $b_n\to b=0,$ so $ab=0,$ but $a_nb_n=n\to \infty\ne 0.$
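A quick numerical illustration (not a proof, of course) of this example:

```python
# a_n = n^2 -> inf and b_n = 1/n -> 0, yet the product a_n * b_n = n
# grows without bound, so it does not converge to a*b = 0.
for n in (10, 1_000, 100_000):
    a_n = n ** 2
    b_n = 1 / n
    print(n, b_n, a_n * b_n)  # product is (up to rounding) n itself
```

Note that neither sequence here is increasing ($b_n$ is decreasing), so this does not conflict with the second half of statement (2).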
I'd appreciate it if someone can confirm, refute, or comment further about it. Thanks a lot!