For a problem, I need to show that $$\lim_{b \rightarrow 1^-} \sum_{n=0}^{\infty}\frac{(-b)^n}{n+\gamma} = \sum_{n=0}^{\infty}\frac{(-1)^n}{n+\gamma},$$ where the series on the left converges for every $b \in [0,1]$.
My plan is to show that the series converges uniformly in $b$, that is, $$\forall \epsilon>0 \hspace{2mm} \exists N\in \mathbb{N} \hspace{2mm} \text{s.t.} \hspace{2mm} \left| \sum_{n>N}\frac{(-b)^n}{n+\gamma} \right| < \epsilon \hspace{4mm} \forall b \in [0,1]$$
I am not sure how to approach this. My main idea is to show that if the bound above holds for $b=1$, then it also holds for $b<1$. However, it is not enough that each term is smaller in magnitude than the corresponding term with $b=1$: the positive terms might decrease by less than the negative terms, so the alternating tail could end up larger. I would really appreciate some help showing this result!
Group consecutive terms. For every $b\in[0,1]$ (with $\gamma>0$),
$$\left|\frac{(-b)^n}{n+\gamma}+\frac{(-b)^{n+1}}{n+1+\gamma}\right| = \frac{b^n}{n+\gamma}-\frac{b^{n+1}}{n+1+\gamma} = \frac{b^n(1-b)}{n+\gamma}+\frac{b^{n+1}}{(n+\gamma)(n+1+\gamma)} \le \frac{1}{(n+1)(n+\gamma)}+\frac{1}{(n+\gamma)(n+1+\gamma)},$$
using $b^n(1-b)\le\frac{1}{n+1}$ on $[0,1]$. This bound is independent of $b$ and summable, so the tail is $<\epsilon$ for all such $b$ once $N$ is large enough.
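To spell out how a uniform tail estimate gives the limit in the question (a sketch of the standard splitting argument; $S(b)$ is shorthand introduced here for the sum):
$$\big|S(b)-S(1)\big| \;\le\; \left|\sum_{n=0}^{N}\frac{(-b)^n-(-1)^n}{n+\gamma}\right| \;+\; \left|\sum_{n>N}\frac{(-b)^n}{n+\gamma}\right| \;+\; \left|\sum_{n>N}\frac{(-1)^n}{n+\gamma}\right|.$$
Pick $N$ so that the last two terms are each below $\epsilon/3$ for every $b\in[0,1]$ (the estimate above makes this possible), then let $b\to 1^-$: the finite sum tends to $0$ term by term, so $|S(b)-S(1)|<\epsilon$ for $b$ close enough to $1$.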