Is it true that the sum of all numbers equals 0?


I'm not a mathematician, but I'm studying Nothing, so 0 is relevant, and I'm wondering about the fact that numbers seem to be mutually canceling polarities extending from 0, that is, 0 = n - n or 0 = (+n) + (-n), and:

...-2, -1, 0, 1, 2...
...-2, -1.5, -1, -0.5, 0, 0.5, 1, 1.5, 2...

So at the very least, if you equally extend two collections of numbers from 0 in both directions and sum them, the result will always equal 0. But I'm not sure what would happen if you have two sets with all the numbers in both directions, as well as an infinite fractional resolution between whole numbers, because then both sets would be infinite, or negative and positive infinities: 0 = (+Infinity) + (-Infinity), if that makes sense (I'm not sure how you would even define, e.g., infinitely many positive whole numbers).

Also I seem to remember that there is something called "complex numbers" which aren't present on the number line.

So:

  1. Summing a finite set of polarizing whole numbers extending from 0 with a finite resolution between whole numbers will always equal 0.
  2. Is this also true if we include more numbers than the ones on the number line (complex numbers, etc.)?
  3. Is this also true for an infinite number of whole numbers with an infinite fractional resolution?
  4. Is "The sum of all numbers equals 0" True?
  5. Is there any formal way to define #1 and #4?

Clarification

The calculation I'm interested in also has these arbitrary restrictions:

  • "all numbers" means all numbers with only one instance of each number within the formula
  • the order of the numbers in the calculation is 0+(+1)+(-1)+(+2)+(-2)... etc.
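The stipulated ordering can be sketched numerically; this is just an illustration, and the generator name `interleaved` is my own, not part of the question:

```python
# Generate the integers in the stipulated order 0, +1, -1, +2, -2, ...
# up to a finite cutoff, then sum them.
def interleaved(limit):
    """Yield 0, 1, -1, 2, -2, ... out to +/-limit."""
    yield 0
    for n in range(1, limit + 1):
        yield n
        yield -n

terms = list(interleaved(3))
print(terms)       # [0, 1, -1, 2, -2, 3, -3]
print(sum(terms))  # 0
```

Any cutoff that ends just after a -n term lands back at 0, which is the finite version of claim #1.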

There are 2 answers below.

Best answer

1-2. Yes: as long as, whenever some number $a$ is summed, $-a$ is also summed the same number of times, the finite sum will always be $0$. This is true for complex numbers as well. Since the sum is finite, it is already formally defined.
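A finite sanity check of this for claims 1-2, pairing each $a$ with $-a$ across integers, half-integer fractions, and complex numbers (the collections here are my own made-up examples):

```python
# If every number a in a finite collection is paired with -a,
# the total is 0 -- for integers, fractions, and complex numbers alike.
from fractions import Fraction

ints      = [n for k in range(1, 101) for n in (k, -k)]
halves    = [Fraction(k, 2) * s for k in range(1, 101) for s in (1, -1)]
complexes = [z for k in range(1, 101) for z in (complex(k, k), complex(-k, -k))]

print(sum(ints))       # 0
print(sum(halves))     # 0
print(sum(complexes))  # 0j
```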

3-5. The sum of an infinite series of numbers is only well defined when its partial sums approach a finite number. For example, $1 + \frac{1}2 + \frac{1}4 + \frac{1}8 + ...$ gets arbitrarily close to $2$, so the sum may be rigorously defined to be exactly $2$. Such a series is called "convergent." The sum you want to evaluate, $$1 - 1 + 2 - 2 + 3 - 3 + ...$$ does not approach a value, so it is not a well-defined sum in the usual sense (this type of series is called "divergent"). However, there are ways to assign a value to such a sum. For example, if $s_n$ is the sum of the first $n$ terms of the series, the limit

$$\lim_{n\rightarrow\infty} \frac{s_0 + s_1 + s_2 + ... + s_n}{n + 1}$$ is known as the Cesàro sum of the series when the limit exists. For your series, this limit still does not exist, so in the Cesàro sense your sum is also divergent. Another method of calculating a divergent sum is to define a function $$ f(s) = \frac{1}{{a_0}^s} + \frac{1}{{a_1}^s} + \frac{1}{{a_2}^s} + ... $$

where the $a_i$ are the $i$th terms of the series one wishes to sum. Complex-valued functions such as $f(s)$ may be unambiguously extended even to values of $s$ where a defining sum such as the one for $f(s)$ above does not converge (this is known as the "analytic continuation" of a function). The zeta regularization of a sum is the value of the analytic continuation at $-1$. For your series, the sum $$ 1 - 1 + \frac{1}{2^s} - \frac{1}{2^s} + ... $$ is $0$ whenever it converges, thus the analytic continuation evaluated at $-1$ is $0$, so in the sense of zeta regularization your sum is indeed zero. While for this series the sum is what it "should" be intuitively, this is not always the case. For example, $1 - 1 + 1 - 1 + ...$ is $\frac{1}2$ in Cesàro summation and does not exist in zeta regularization. If you are interested in reading more about summing divergent series, this is a good book.
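The Cesàro behavior of both series can be checked numerically. In this sketch (the helper name `cesaro_means` is mine), the means for $1 - 1 + 2 - 2 + ...$ keep growing, while the means for $1 - 1 + 1 - 1 + ...$ settle at $\frac{1}2$:

```python
# Cesàro means: the average of the first n+1 partial sums.
from itertools import accumulate

def cesaro_means(terms):
    partials = list(accumulate(terms))        # s_0, s_1, s_2, ...
    running = list(accumulate(partials))      # s_0, s_0+s_1, ...
    return [running[n] / (n + 1) for n in range(len(partials))]

grow = [s * k for k in range(1, 1001) for s in (1, -1)]  # 1, -1, 2, -2, ...
grandi = [(-1) ** n for n in range(2000)]                # 1, -1, 1, -1, ...

print(cesaro_means(grow)[-1])    # 250.25 -- grows with more terms, so divergent
print(cesaro_means(grandi)[-1])  # 0.5
```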

Another answer

Disclaimer: You're asking a tough question. It's tough because it really requires some detailed thinking, and perhaps highlights how "common sense" can lead us astray when thinking about the infinite. What follows is some expository noodling, and then a disappointing answer.


The standard, rigorous approach is the one you'll find in any calculus text; Stewart or Rogawski for example.

To talk about an infinite series $a_0 + a_1 + a_2 + \ldots = \displaystyle \sum_{n = 0}^\infty a_n$, we define the $N$th partial sum $S_N = \displaystyle \sum_{n = 0}^N a_n$ as the sum of the first $N$ terms. The $a_n$ are the things we're adding up, and they're inherently ordered. For you, $a_0 = 0,\ a_1 = 1,\ a_2 = (-1)$, and so on.

Then, to answer the question, "Does $\displaystyle \sum_{n = 0}^\infty a_n$ converge (that is to say, have a nice finite value)?" we have to decide whether the sequence of partial sums $S_N$ converges to a nice finite value. To do this, we'd need to know how to define convergence of a sequence, which isn't too bad, although it is fairly technical.
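A small sketch of partial sums, contrasting a convergent geometric series with the asker's series (the function name `partial_sums` is mine):

```python
# Partial sums S_N for 1 + 1/2 + 1/4 + ... (converges to 2)
# versus 0 + 1 - 1 + 2 - 2 + ... (partial sums keep jumping).
def partial_sums(terms):
    total, out = 0, []
    for t in terms:
        total += t
        out.append(total)
    return out

geometric = [0.5 ** n for n in range(20)]
askers = [0] + [s * k for k in range(1, 6) for s in (1, -1)]

print(partial_sums(geometric)[-3:])  # all very close to 2
print(partial_sums(askers))          # [0, 1, 0, 2, 0, 3, 0, 4, 0, 5, 0]
```

The geometric partial sums settle down near one value; the asker's bounce between $0$ and ever-larger numbers, which is exactly what "does not converge" will mean below.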

But the point is, it takes work, and lots of careful reasoning!

Mathematicians have been working with the infinite and the infinitesimal for an extremely long time (thousands of years, in fact) but it was only relatively recently, in the mid-to-late nineteenth century, that we'd worked out all of the philosophical and logical bugs, so to speak, by converting the old reasoning to something considered more rigorous by modern standards. In the 18th century, the great Leonhard Euler may very well have said your series converged (but that's a complete guess, for illustrative purposes only!)

Do note that there are various provisos to everything I'm saying; it's not quite as cut-and-dried as I've presented (see the answer that refers to divergent series, something I know literally nothing about). Mathematics is quite a diverse field populated with many varying opinions and opinionated people :)

For interesting historical bits, see this page for an overview, or George Berkeley's criticism of Newton's calculus, snippets available here. People like George Berkeley are why mathematicians needed to do a logical cleanup on early calculus in the nineteenth century.


Stab at actual answer

The issue here is that, while half the time we do indeed get $0$ after adding up finitely many terms, the other half of the time (if we add one more term) we get very far away from $0$, as far away as you'd like: just after you add $+1000$, for example, the partial sum is $1000$. Since this happens, using the standard definition of the limits involved, we have to say the sum does not converge, even with its typical order.

In order to make it convergent, you basically have to stipulate that you're really adding both $n$ and $-n$ at the same step, so your series is as boring as $0 + 0 + 0 + \ldots = 0$.
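A toy check of that grouped version, where each step adds the pair $(n, -n)$ together:

```python
# Grouping n and -n at the same step makes every term zero,
# so every partial sum is zero and the series trivially converges to 0.
grouped = [0] + [k + (-k) for k in range(1, 1001)]
print(set(grouped))  # {0}
print(sum(grouped))  # 0
```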