From Casella and Berger, *Statistical Inference*, 2nd ed., Example 2.4.9. The authors state:
> It is clear that, for $0< \theta < 1$, $S_\infty=\lim_{n \rightarrow \infty}S_n(\theta)=0$. Since $S_n(\theta)$ is continuous, the convergence is uniform on any closed bounded interval.
I can't see how this conclusion was reached. Going by the definition of uniform convergence, it seems we should fix $\epsilon > 0$ and then find $N$ such that $$n>N \implies |(n+1)(1-\theta)^n| < \epsilon \hspace{2em} \text{for all $0 < \theta < 1$}.$$
However, I am stuck here. The authors' wording seems to imply that they invoked a theorem using the continuity of $S_n(\theta)$?
Dini's theorem is the tool needed here: if a monotone sequence of continuous functions converges pointwise on a compact space, and if the limit function is also continuous, then the convergence is uniform.
On each compact subinterval $[a,b] \subset (0,1)$, the sequence $S_n$ is monotone for $n$ large enough: with $S_n(\theta)=(n+1)(1-\theta)^n$ we have $$\frac{S_{n+1}(\theta)}{S_n(\theta)}=\frac{n+2}{n+1}(1-\theta)\le\frac{n+2}{n+1}(1-a)<1$$ once $n$ is large enough (depending only on $a$), so $S_n(\theta)$ is eventually decreasing in $n$, uniformly in $\theta \in [a,b]$. Since each $S_n$ is continuous and the limit function $0$ is continuous, Dini's theorem gives uniform convergence on $[a,b]$. Note that the convergence is *not* uniform on all of $(0,1)$: $\sup_{0<\theta<1} S_n(\theta) = n+1$, which is why the restriction to closed bounded subintervals matters.
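A quick numerical sanity check (not a proof) of this behavior: on a compact subinterval of $(0,1)$ (here $[0.1, 0.9]$, chosen arbitrarily), the supremum of $S_n(\theta)=(n+1)(1-\theta)^n$ over a fine grid should be eventually decreasing in $n$ and tend to $0$, exactly as Dini's theorem predicts.

```python
import numpy as np

# Grid approximation of a compact subinterval [a, b] of (0, 1).
a, b = 0.1, 0.9
theta = np.linspace(a, b, 10_001)

def sup_S_n(n):
    """Approximate sup over [a, b] of S_n(theta) = (n+1)(1-theta)^n."""
    return np.max((n + 1) * (1 - theta) ** n)

sups = [sup_S_n(n) for n in range(1, 201)]

# The sup is attained at theta = a once n is large, and tends to 0;
# the sequence increases for small n and then decreases monotonically,
# matching "monotone for n large enough" in the argument above.
print(sups[0], sups[-1])
```

Repeating the experiment with $a$ closer to $0$ shows the decay slowing down, which matches the failure of uniform convergence on the open interval $(0,1)$.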