Wrong proof... but where is the mistake?


So I've just watched this wonderful Numberphile video about transcendental numbers.

In the video, the guy shows that

$$e=\sum_{n=0}^\infty\frac{1}{n!}=1+\frac{1}{1}+\frac{1}{1\cdot2}+\frac{1}{1\cdot2\cdot3}+\cdots$$

In the video, he says that if a number can be reduced to zero, it is algebraic.

Now, if we take:

$$1+\frac{1}{1}+\frac{1}{1\cdot2}+\frac{1}{1\cdot2\cdot3}+\cdots$$

and multiply it by $1$, we get:

$$1+1+\frac{1}{2}+\frac{1}{2\cdot3}+\cdots$$ take that and multiply by $2$, we get:

$$2+2+1+\frac{1}{3}+\cdots$$

Next, multiplying by $3$, we get:

$$6+6+3+1+\cdots$$

And so on. Wait, isn't that something that can be reduced to zero?

Obviously there is a mistake somewhere, but I can't seem to find it.

(Just highschool student here - so take it easy :))

EDIT:

When I say reduced to zero I mean: using addition, subtraction, multiplication, division, and raising to a whole-number power, in order to reduce the number to zero. So in the last expression, we can subtract the $6$, then the other $6$, then the $3$, and so on.


There are 2 best solutions below

On BEST ANSWER

You are not specifying the rules of the game clearly, but whatever they may be, the intention is that the number of steps in the game is finite. Your algorithm, in contrast, is infinite: you need to multiply the series expansion of $e$ by all positive integers in order to make all the terms integral, which doesn't make much sense.
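The problem can be made concrete with exact rational arithmetic: no matter how far you carry the multiplication, the tail of the series never becomes an integer, so the "game" never terminates. A small sketch (the variable names and the cutoff `N` are my own choices):

```python
from fractions import Fraction
from math import factorial

# Partial sum of the series for e: 1/0! + 1/1! + 1/2! + ... + 1/N!
N = 20
partial = sum(Fraction(1, factorial(k)) for k in range(N + 1))

# Multiply by n! for growing n, as in the question. The first n+1 terms
# become integers, but the remaining terms stay fractional forever.
for n in range(1, 6):
    scaled = factorial(n) * partial
    print(n, scaled - int(scaled))  # the fractional part never vanishes
```

Each multiplication clears finitely many denominators but leaves infinitely many behind, which is exactly why the procedure never finishes.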

It is somewhat hard, though not impossible, to define "algebraic" along the lines of your game. One definition would be as follows. A "reduction of $x$ to zero" is any computation involving $x$, rational numbers, addition, subtraction, multiplication, division (but not by zero!), and raising to an integer power, whose result is zero. It is non-trivial if there is some other number $y$ so that if you repeat the very same computation then you don't get zero. A number is algebraic if it has a non-trivial reduction to zero.

Using induction you can show that a number $x$ is algebraic if and only if there exist rationals $a_0,\ldots,a_n$, not all zero, so that $$ a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \cdots + a_n x^n = 0. $$ Hermite proved that $e$ doesn't satisfy any such equation, and his proof has since been simplified, though it is still not very simple.
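A finite search proves nothing, of course, but it illustrates Hermite's theorem: no small integer polynomial comes anywhere near vanishing at $e$. A quick sketch (the degree and coefficient bounds are arbitrary choices of mine):

```python
import itertools
import math

e = math.e

# Try every polynomial a0 + a1*x + a2*x^2 + a3*x^3 with coefficients
# in [-5, 5], not all zero, and record how close it gets to 0 at x = e.
best = min(
    (abs(sum(c * e**i for i, c in enumerate(coeffs))), coeffs)
    for coeffs in itertools.product(range(-5, 6), repeat=4)
    if any(coeffs)
)
print(best)  # smallest |value| found, and the coefficients achieving it
```

The minimum stays bounded away from zero; Hermite's theorem says this holds for every degree and every coefficient bound.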

What does the proof look like? We can start with an easier target: showing that $e$ is not rational. Here the proof is reasonably simple. Suppose that $e = p/q$, and multiply the series expansion of $e$ by $q!$ to get $$ (q-1)! p = \frac{q!}{0!} + \frac{q!}{1!} + \cdots + \frac{q!}{q!} + \frac{1}{q+1} + \frac{1}{(q+1)(q+2)} + \cdots. $$ Since $p,q$ are integers, this implies that $$ \frac{1}{q+1} + \frac{1}{(q+1)(q+2)} + \cdots $$ is an integer. However, $$ 0 < \frac{1}{q+1} + \frac{1}{(q+1)(q+2)} + \cdots < \frac{1}{q+1} + \frac{1}{(q+1)^2} + \cdots = \frac{1}{q} \leq 1, $$ and we reach a contradiction.
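The key inequality — the tail is strictly between $0$ and $1$ — can be checked with exact rational arithmetic. A sketch, where `q` is an arbitrary sample denominator and the tail is truncated at 40 terms (a choice of mine; the truncated sum is still a strict lower bound on the full tail):

```python
from fractions import Fraction
from math import factorial

q = 7  # any hypothetical denominator of e = p/q

# Tail of the series after multiplying by q!: sum over k > q of q!/k!,
# i.e. 1/(q+1) + 1/((q+1)(q+2)) + ...
tail = sum(Fraction(factorial(q), factorial(k)) for k in range(q + 1, q + 40))
print(tail)  # strictly between 0 and 1/q, hence not an integer
```

Since every remaining term is positive and the geometric bound gives $1/q$ as an upper limit, the tail cannot be an integer, which is the contradiction.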

The proof that $e$ isn't algebraic is much more complicated, and can be viewed here as well as in many other places on the web.


You don't define clearly what "reduced to zero" means. It appears you are at least allowed to multiply by integers and add/subtract integers. With just these operations you can reduce any rational number to zero. If the number is $\frac ab$, you can multiply by $b$ and subtract $a$ to get to zero.
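In exact arithmetic the rational case is a two-step reduction (a minimal sketch, using $3/7$ as an example of my own):

```python
from fractions import Fraction

x = Fraction(3, 7)   # the number a/b to reduce
reduced = x * 7 - 3  # multiply by b, then subtract a
print(reduced)       # → 0
```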

For algebraic numbers, you also need to be allowed to raise to powers and add/subtract previous results. The algebraic numbers are the solutions of polynomials with integer coefficients. For example, $\sqrt 2$ is algebraic because it satisfies $x^2-2=0$: you can reduce it to zero by squaring and subtracting $2$. For a more complicated example, $x^5+2x^4+3x^3+4x^2+5x+6$ has a single real root near $-1.5$. With integer powers, integer multiplications, and integer additions you can reduce that root to zero just by evaluating the polynomial at it.
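Both reductions can be checked numerically (floating point, so compare against a small tolerance rather than exact zero; locating the quintic's root by bisection is my addition, not part of the answer):

```python
import math

# sqrt(2): square it, subtract 2, and you reach zero.
x = math.sqrt(2)
print(abs(x**2 - 2) < 1e-12)  # True

# The root of x^5 + 2x^4 + 3x^3 + 4x^2 + 5x + 6 near -1.5, by bisection.
def p(t):
    return t**5 + 2*t**4 + 3*t**3 + 4*t**2 + 5*t + 6

lo, hi = -2.0, -1.0  # p(lo) < 0 < p(hi), so a root lies between
for _ in range(60):
    mid = (lo + hi) / 2
    if p(mid) < 0:
        lo = mid
    else:
        hi = mid
r = (lo + hi) / 2
print(abs(p(r)) < 1e-9)  # True: "following the polynomial" reduces r to zero
```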

That dot dot dot at the end of your last expression hides the fact that there are infinitely many terms, so you cannot reduce $e$ to zero with this approach. No matter what you multiply by, there will still be infinitely many fractions off the right end. This does not prove that there is no other way to do it, but in fact there is not. Wikipedia says this was proved by Charles Hermite in 1873.