The binomial series is $$(1+x)^\alpha = \sum_{k=0}^\infty \binom{\alpha}{k}\, x^k,$$ where $$\binom{\alpha}{k} = \dfrac{\alpha(\alpha-1)(\alpha-2) \cdots (\alpha-k+1)}{k!}.$$ I have to prove that if $x=-1$, the series converges for $\alpha \geq 0$ and diverges for $\alpha < 0$.
I'm confused, because if $x=-1$ the identity would read $$0 = \sum_{k=0}^\infty \binom{\alpha}{k}\, (-1)^k,$$ and I don't know how to continue with the proof.
If $\alpha$ is restricted to the integers: when $\alpha < 0$ and $x = -1$, every term in the series is positive and greater than or equal to one, since $$\binom{\alpha}{k}(-1)^k = \binom{k-\alpha-1}{k} \geq 1,$$ so the terms do not tend to zero and the series diverges. When $\alpha \geqslant 0$, then for $k > \alpha$ each coefficient $$\binom{\alpha}{k} = \frac{\alpha(\alpha-1)\cdots(\alpha-k+1)}{k!}$$ is zero, because once $k > \alpha$ the numerator contains the factor $\alpha - \alpha = 0$. Thus the terms in the series are all zero for sufficiently large $k$, and the series converges for all values of $x$; in this case it is just a polynomial of degree $\alpha$.
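If it helps to see the integer case numerically, here is a small sketch (the function names `binom` and `partial_sum` are just illustrative) that computes the generalized binomial coefficient from the product formula above and evaluates partial sums at $x=-1$. For $\alpha = -1$ every term equals $1$, so the partial sums grow without bound; for $\alpha = 2$ the terms vanish once $k > 2$ and the sum settles at $(1-1)^2 = 0$:

```python
def binom(alpha, k):
    """Generalized binomial coefficient alpha(alpha-1)...(alpha-k+1)/k!."""
    c = 1.0
    for j in range(k):
        c *= (alpha - j) / (j + 1)
    return c

def partial_sum(alpha, x, n):
    """Partial sum sum_{k=0}^{n} binom(alpha, k) * x**k of the binomial series."""
    return sum(binom(alpha, k) * x**k for k in range(n + 1))

# alpha = -1, x = -1: each term binom(-1,k)(-1)^k = 1, so the n-th
# partial sum is n + 1 and the series diverges.
print(partial_sum(-1, -1, 10))   # 11.0

# alpha = 2, x = -1: coefficients vanish for k > 2; the series
# terminates and equals (1 + (-1))^2 = 0.
print(partial_sum(2, -1, 10))    # 0.0
```

Increasing `n` in the first call makes the partial sum as large as you like, which is exactly the divergence claimed above.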