Let $-1\lt x\lt 1$ and $s=\lim_{n\to\infty}s_n$. Given an arbitrary $\varepsilon \gt 0$, there exists $N$ such that $n\gt N$ implies $|s-s_n|\lt \frac{\varepsilon}{2}$. Then $$\left|(1-x)\sum_{n=0}^\infty (s_n-s)x^n\right|\le (1-x)\sum_{n=0}^N |s_n-s||x|^n+\frac{\varepsilon}{2}\le \varepsilon$$ whenever $x\gt 1-\delta$ for a suitable $\delta\gt 0$ (taking $\delta\lt 1$, so that $x\gt 0$).
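The two estimates behind the display can be spelled out (a sketch filling in the intermediate steps, valid for $0\le x\lt 1$). The tail is controlled by the choice of $N$:
$$(1-x)\sum_{n=N+1}^{\infty}|s_n-s|\,x^n \le \frac{\varepsilon}{2}\,(1-x)\sum_{n=0}^{\infty}x^n = \frac{\varepsilon}{2},$$
and the finite head is controlled by taking $x$ close to $1$:
$$(1-x)\sum_{n=0}^{N}|s_n-s|\,|x|^n \le (1-x)\sum_{n=0}^{N}|s_n-s| \le \frac{\varepsilon}{2} \quad\text{whenever } 1-x \lt \delta := \frac{\varepsilon}{2\left(1+\sum_{n=0}^{N}|s_n-s|\right)}.$$
Note that only the head estimate uses $x\gt 1-\delta$; the tail estimate works for any $0\le x\lt 1$.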
This is based on Rudin's *Principles of Mathematical Analysis*, pages 174–175.
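As a numerical sanity check of the limit being proved, here is a minimal sketch (the alternating harmonic series $\sum (-1)^n/(n+1)$, with $s=\ln 2$, and the name `abel_mean` are my own illustration, not from Rudin):

```python
import math

def abel_mean(x, terms=50000):
    """Approximate (1-x) * sum_n s_n x^n, where s_n are the partial
    sums of the alternating harmonic series (so s = ln 2)."""
    s_n = 0.0
    total = 0.0
    for n in range(terms):
        s_n += (-1) ** n / (n + 1)  # partial sum s_n
        total += s_n * x ** n
    return (1 - x) * total

# As x -> 1-, the value should approach s = ln 2 ~ 0.6931.
for x in (0.9, 0.99, 0.999):
    print(x, abel_mean(x))
```

The quantity computed is exactly the left-hand side of the display above plus $s$, so the printed values approaching $\ln 2$ illustrate the theorem.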
Is the conclusion still true if we replace the condition $x\gt 1-\delta$ with $x\lt 1-\delta$? Combined with $-1\lt x\lt 1$, such an $x$ still lies in the required interval, so $1-x$ is still positive. Why shouldn't the same argument go through?