Newton's Binomial Theorem with $n<0$


I was watching this video from Veritasium (https://www.youtube.com/watch?v=gMlf1ELvRzc), in which one of the points he brings up is applying the Binomial Theorem with $n<0$. To clarify:

$$ (1+x)^n = \frac{1}{0!} + \frac{n}{1!}x + \frac{n(n-1)}{2!}x^2 + \frac{n(n-1)(n-2)}{3!}x^3 + \cdots $$

There is also the more familiar finite form, $(a+b)^n = \sum_{k=0}^n \frac{n!}{k!(n-k)!} a^{n-k}b^k$, but it would only complicate things here.

The argument is that for positive integers $n \in \mathbb{Z}^+$, the series in principle goes on forever, but at some point a factor of $(n-n)=0$ appears in the numerator, which makes every term from there on equal to $0$, so the sum is actually finite. For negative integers $n$, however, no factor ever vanishes, and you get a genuine infinite series. For instance, for $n=-1$:

$$ (1+x)^{-1} = \frac{1}{0!} + \frac{-1}{1!}x + \frac{-1(-1-1)}{2!}x^2 + \frac{-1(-1-1)(-1-2)}{3!}x^3 + \cdots = 1+(-1)x+1x^2+(-1)x^3+\cdots $$
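This is easy to check numerically. Below is a small sketch (the helper `binom_coeff` is my own name, not a standard library function) that computes the generalized coefficient $\frac{n(n-1)\cdots(n-k+1)}{k!}$, shows that it vanishes past $k=n$ for positive integer $n$ but never does for $n=-1$, and confirms that for $|x|<1$ the partial sums of the $n=-1$ series approach $\frac{1}{1+x}$:

```python
def binom_coeff(n, k):
    """Generalized binomial coefficient n(n-1)...(n-k+1)/k!, valid for any real n."""
    c = 1.0
    for i in range(k):
        c *= (n - i) / (i + 1)
    return c

# For positive integer n, a factor (n - n) = 0 appears, so every
# coefficient beyond k = n is zero and the series terminates:
print([binom_coeff(3, k) for k in range(6)])

# For n = -1 the coefficients never vanish: 1, -1, 1, -1, ...
print([binom_coeff(-1, k) for k in range(6)])

# Inside |x| < 1 the partial sums converge to 1/(1+x):
x = 0.5
s = sum(binom_coeff(-1, k) * x**k for k in range(50))
print(s, 1 / (1 + x))  # both close to 0.6666...
```

Note that the convergence in the last check relies on $|x|<1$; the question below is precisely about what happens at the boundary $x=1$.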

Here's my question: what if, in the example above, $x=1$? Then the series would just look like $1-1+1-1+1-1+\cdots$, whose partial sums alternate back and forth between $1$ and $0$. And yet, if you plug $x=1$ into the original expression $(1+x)^n$, this should be:

$$ (1+x)^n = (1+1)^{-1} = \frac{1}{2} $$

So how is this true? Or am I missing something here?

$$ \sum_{a=0}^\infty (-1)^a = \frac{1}{2} $$

On BEST ANSWER

This is known as Grandi's series. It diverges in the usual sense of convergence (limit of partial sums). However, there are other notions of convergence for infinite series that are more forgiving. For example, using Cesàro summation (limit of arithmetic means), it converges to $\tfrac{1}{2}$.
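As a quick numeric illustration of Cesàro summation (a sketch, not part of the original answer): the partial sums of Grandi's series oscillate between $1$ and $0$, but their running averages settle down to $\tfrac{1}{2}$.

```python
# Cesàro summation of Grandi's series 1 - 1 + 1 - 1 + ...
# The partial sums oscillate between 1 and 0, but the Cesàro means
# (running averages of the partial sums) converge to 1/2.

N = 10000
partial = 0
total = 0.0
means = []
for k in range(N):
    partial += (-1) ** k      # partial sums: 1, 0, 1, 0, ...
    total += partial
    means.append(total / (k + 1))

print(means[-1])  # prints 0.5
```

The partial sums themselves have no limit, which is exactly why the series diverges in the ordinary sense; only the averaging step tames the oscillation.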

If this bothers you, you're not alone! But rather than fret, consider this: the fact that your intuition about finite things doesn't extend well to infinite things is not the fault of the infinite things.

Divergent series are weird and wonderful! Remember that, despite our speaking of the "sum" of an infinite series, there is no such thing as literally adding up infinitely many numbers. We can only sum a pair of numbers $a_1+a_2$; then, by induction and associativity, we can extend that to a finite sum $a_1 + \cdots + a_n$. Any notion that extends to an infinite series necessarily involves some sort of limiting process, so it's not really a sum. We therefore have choices about how we wish to make such a generalization.