Zero and the concept of continuity


This may be eerily similar to this question, since the concepts of $0$ and of $\infty$ go hand in hand, but $0$ is more natural and is actually considered a number. I am probably missing something very obvious, but here is the question.

With regard to all of the trouble that $0$ causes (singularities in otherwise nice functions), which manifests in the physical world (the center of a black hole in general relativity, for example), wouldn't it be simpler to also regard $0$ as a concept, similar to the concept of $\infty$, while letting the smallest possible number, 1 unit, be something agreed upon among mathematicians, some $\epsilon$? Of course, if we want to refer to the current $0$, we simply say nothing.

This not only avoids the concept of $\infty$ in computational cases, but still allows the use of the concept of $\infty$ when making an analytical argument. Furthermore, one usual concern, differentiation and integration (calculus), can be thought of as the approximation obtained by letting "two points" tend toward $0$ apart, while the actual computation uses those "two points" 1 unit away from one another.
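To make the "two points 1 unit apart" idea concrete, here is an illustrative sketch (not from the original post; the function and step sizes are my own choices) comparing a finite difference taken with a fixed unit step against one with a small step:

```python
# Hypothetical illustration: a derivative approximated by "two points"
# a fixed distance h apart. With h = 1 (one "unit") the result is a
# discrete approximation; shrinking h recovers the continuous derivative.
def forward_difference(f, x, h):
    """Finite-difference approximation of f'(x) with step h."""
    return (f(x + h) - f(x)) / h

f = lambda x: x * x  # example function; exact derivative is f'(x) = 2x

approx_unit = forward_difference(f, 3.0, 1.0)   # (16 - 9) / 1 = 7.0
approx_fine = forward_difference(f, 3.0, 1e-6)  # close to the exact value 6.0
```

The unit-step answer (7.0) differs from the exact derivative (6.0) by a fixed amount, which is precisely the gap the limit concept is designed to close.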

It seems to me that mathematics constructs an "infinite/continuous expansion" of the "finite/discrete world" that everything lives in, for the sake of simplifying analysis and computation. But with the rise in computing power, wouldn't it be appropriate to reconstruct the number system and other important concepts from continuous/infinite toward discrete/finite? I understand that there is the field of discrete mathematics, but it feels very lacking (apologies for my ignorance). For example, theorems on discrete functions are few and far between; e.g. we have not figured out analytically the "simple" logistic chaos in the discrete case.
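The discrete logistic map mentioned above can be written down in a few lines; here is an illustrative sketch (parameter values are my own choices, not from the post) showing the sensitive dependence on initial conditions that makes its chaotic regime resistant to closed-form analysis:

```python
# The discrete logistic map x_{n+1} = r * x_n * (1 - x_n), whose chaotic
# regime (e.g. r = 4) has no known closed-form analysis comparable to the
# continuous logistic differential equation.
def logistic_orbit(r, x0, n):
    """Iterate the logistic map n times starting from x0; return the orbit."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points differing by one part in a billion diverge visibly
# within a few dozen iterations at r = 4.
a = logistic_orbit(4.0, 0.2, 50)
b = logistic_orbit(4.0, 0.2 + 1e-9, 50)
```

Despite the recurrence being elementary arithmetic, predicting the orbit far ahead effectively requires iterating it step by step, which is the computational face of the analytical difficulty the question alludes to.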

I apologize if this question seems silly, but it has bothered me for a long time: if we used a discrete system to reformulate the mathematics of relativity and quantum mechanics, perhaps we would not have any trouble unifying these grand theories.

2 Answers

Accepted Answer

The question is not very focused, but I can see that you put effort into writing it. So here are my 2 cents:

The distinction between "number" and "concept" (like $\infty$) is meaningless, at least nowadays. There is no commonly agreed upon definition of "number" (integer, rational, real, complex, surreal, ordinal ...?) and few mathematicians really care about that anyway. It's like arguing about what constitutes "true metal music" or "a true Scotsman" (pun intended). If someone told you that $\infty$ is only a "concept", not a number, they merely meant $\infty \notin \mathbb R$, and said this because they wanted you to take the definition of a limit seriously (i.e. you can't just "plug in" $\infty$).

As for requiring $0\notin \mathbb R$: a lot of mathematics would fail because of that. The reals would no longer form a group under $+$. Most of (applied) linear algebra would become virtually meaningless. The (/a) definition of a limit would become fairly pointless. The real line would suddenly be punctured and hence no longer connected, etc. etc.

And it wouldn't even have any effect on, e.g., the function $x\mapsto \frac 1 x$. This one isn't even defined at $0$ and is in fact continuous on its entire domain $(-\infty,0)\cup(0,\infty)$.

Any particular number $\varepsilon > 0$ is completely arbitrary, so most mathematicians would probably feel uncomfortable choosing such a number once and for all. Not to mention there is little to no point in it.

As for modelling physics with only discrete mathematics: this would be a major nuisance and mostly impossible. Nobody can tell you precisely the configuration of particles at a given time, and it is infeasible to set up such a detailed model (for a particularly good example, take a look at the double pendulum).

Using analysis ("continuous mathematics") for a slightly less detailed model simply works much better practically than using discrete models.

Answer

Actually you are not the first to draw attention to the apparent need "to reconstruct the number system and other important concepts from continuous/infinite toward discrete/finite." For example, below is a (rather lengthy) quote from Opinion 115 by Prof. Doron Zeilberger. (Note, however, that this very interesting opinion is dated 1 April 2011.)

Not even engineering and physics majors (and especially not chemistry and certainly not mathematics majors) need traditional continuous calculus, with its long-winded and tedious definitions and theorems, which are all obsolete in today's digital age. Instead we should teach them discrete calculus. The Fundamental Theorem of Discrete Calculus is much more user-friendly than its continuous namesake, and only takes a few lines to prove. Here it is.

Fundamental Theorem of Discrete Calculus: Let $i$ be a fixed integer, and for $n\ge i$ let $S(n):=a(i)+a(i+1)+\cdots+a(n)$. Then for all $n>i$: $S(n)-S(n-1)=a(n)$.

Proof: $$S(n)-S(n-1)=(a(i)+a(i+1)+\cdots+a(n))-(a(i)+a(i+1)+\cdots+a(n-1)) \\ \mbox{(by definition of $S(n)$ and, analogously, $S(n-1)$)} $$ $$ =((a(i)+a(i+1)+\cdots+a(n-1))+a(n))-(a(i)+a(i+1)+\cdots+a(n-1)) \\ \mbox{(by associativity)} $$ $$ =((a(i)+a(i+1)+\cdots+a(n-1))-(a(i)+a(i+1)+\cdots+a(n-1)))+a(n) \\ \mbox{(by commutativity)} $$ $$ =0+a(n) \\ \mbox{(by the theorem $A-A=0$ for every $A$, applied to $A=a(i)+\cdots+a(n-1)$)} $$ $$ =a(n) \\ \mbox{(by the theorem $0+A=A$ for every $A$, applied to $A=a(n)$).} $$ Q.E.D.
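The telescoping identity in the theorem can also be checked numerically; here is a small illustrative sketch (the sequence $a(k)=k^2$ and starting index are my own choices, not part of the quoted opinion):

```python
# Numerical sanity check of the discrete fundamental theorem:
# with S(n) = a(i) + a(i+1) + ... + a(n), the difference S(n) - S(n-1)
# telescopes to a(n).
def S(a, i, n):
    """Partial sum a(i) + a(i+1) + ... + a(n)."""
    return sum(a(k) for k in range(i, n + 1))

a = lambda k: k * k  # an arbitrary example sequence, a(k) = k^2
i = 3

# The identity holds exactly for every n > i (integer arithmetic, no rounding).
for n in range(i + 1, 10):
    assert S(a, i, n) - S(a, i, n - 1) == a(n)
```

Because only integer addition and subtraction are involved, the check is exact, mirroring the point that the discrete theorem needs nothing beyond the group axioms used in the proof.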