$1/(1-x)$ series: dividing by zero?


I am reading the book *A History of Mathematics* by Boyer. In the chapter about Euler it states that

"Although on occasion he warned against the risk in working with divergent series, he himself used the binomial series"

$1/(1-x) = 1 + x + x^2 + x^3 + \cdots$ for values of $x \geq 1$.

But what about dividing by zero? Isn't the denominator zero when $x = 1$?


There are two answers below.


This cannot hold for $x \geq 1$. Note that

$$\frac{1}{1 - 2} = -1,$$

but

$$\sum_{k = 0}^\infty \frac{1}{2^k} = 2.$$

In particular, $1/(1 - x)$ is not even defined when $x = 1$.

The correct expansion is

$$\frac{1}{1 - x} = \sum_{k = 0}^\infty x^k, \qquad \lvert x \rvert < 1.$$
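A quick numerical sanity check (an illustration, not a proof) shows why the restriction $\lvert x \rvert < 1$ matters: the partial sums approach $1/(1-x)$ inside the radius of convergence, and blow up outside it. The helper name below is my own choice, not from the original post.

```python
def geometric_partial_sum(x, n):
    """Partial sum of x**k for k = 0 .. n-1."""
    return sum(x**k for k in range(n))

# Inside the radius of convergence, x = 0.5: the partial sum
# approaches 1/(1 - 0.5) = 2.
inside = geometric_partial_sum(0.5, 50)
print(inside)  # close to 2.0

# Outside, x = 2: the partial sums grow without bound,
# nowhere near 1/(1 - 2) = -1.
outside = geometric_partial_sum(2, 50)
print(outside)  # a huge positive number
```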


From the formula
$$ \sum_{n=0}^\infty x^n = \frac{1}{1-x}, $$
valid for $|x| < 1$, you can replace $x$ by $\frac{1}{x}$ and obtain, for any $|x| > 1$,
$$ \sum_{n=0}^\infty \frac{1}{x^n} = \frac{1}{1-\frac{1}{x}} = \frac{x}{x-1}. $$
In both of these series, however, $|x| \neq 1$.
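The substitution can be checked numerically as well (a sketch in ordinary floating point; the function name is my own): for several values with $|x| > 1$, the partial sums of $\sum (1/x)^n$ settle on $x/(x-1)$.

```python
def inverse_geometric_sum(x, n):
    """Partial sum of (1/x)**k for k = 0 .. n-1, assuming |x| > 1."""
    return sum((1 / x)**k for k in range(n))

for x in (2.0, 3.0, -4.0):
    approx = inverse_geometric_sum(x, 200)
    exact = x / (x - 1)
    print(x, approx, exact)  # approx and exact agree closely
```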