My confusion stems from the following two examples:
Example 1: Consider the series $\displaystyle \sum_{n=1}^{\infty}\frac{x^n}{1+x^n}$. Does this converge uniformly on $[0,a)$ for each $0<a<1$? Answer: Yes. Does this converge uniformly on $[0,1)$? Answer: No.
Example 2: Consider the series $\displaystyle \sum_{n=1}^{\infty}\frac{1}{1+n^2x}$. Does this series converge uniformly on $(0,\infty)$? Answer: Yes, because for each $a>0$, the series converges uniformly on $[a,\infty)$.
My confusion is that the first example seems to contradict the reasoning used in the second: there, uniform convergence on the whole interval is deduced from uniform convergence on each appropriate sub-interval, which is exactly what fails in the first example. Can someone please explain?
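Not part of the original question, but here is a quick numerical sketch (plain Python, with names of my own choosing) of why Example 1 fails to be uniform on $[0,1)$: for any fixed $n$, the $n$-th term $\frac{x^n}{1+x^n}$ can be pushed close to $\frac12$ by taking $x$ near $1$, so the suprema over $[0,1)$ never shrink, while on a fixed $[0,a]$ with $a<1$ they decay geometrically.

```python
def term(n, x):
    # n-th term of Example 1's series: x^n / (1 + x^n)
    return x**n / (1 + x**n)

N = 50  # a fairly large index, chosen for illustration

# On [0, a] with a = 0.9 the n-th term is at most a^n/(1+a^n) -> 0,
# so by the Weierstrass M-test the series converges uniformly there.
a = 0.9
sup_on_sub = max(term(N, a * k / 1000) for k in range(1001))

# On [0, 1) we may take x as close to 1 as we like; there the
# N-th term stays near 1/2 no matter how large N is, so the
# supremum of the tail over [0,1) never tends to 0.
xs_near_1 = [1 - 10**(-j) for j in range(1, 8)]
sup_near_1 = max(term(N, x) for x in xs_near_1)

print(sup_on_sub)  # small: the N-th term is uniformly tiny on [0, 0.9]
print(sup_near_1)  # close to 1/2: the sup over [0,1) does not shrink
```

The point: "uniformly convergent on every $[0,a]$" only controls each fixed sub-interval, while uniform convergence on $[0,1)$ requires one $N$ that works for all $x$ simultaneously, including $x$ arbitrarily close to $1$.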