How Do You Know If Mathematical Definition Matches Up With Reality?


This is probably one of the biggest questions I have when learning mathematics. Suppose I have a concept in my head, say continuity, and I want this concept to characterize a function that has "no breaks": one that is completely connected. So say I then create a definition of continuity, just as the limit definition of continuity does, and I build theorems, lemmas, etc. on top of that definition. After I am done, what I learned from these theorems should apply to any function I have that has no breaks.
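For concreteness, the limit definition of continuity referred to above is the standard one (this is the textbook formulation, not anything specific to this question):

```latex
% f is continuous at a point a when the limit of f(x) as x -> a
% equals the value f(a):
\lim_{x \to a} f(x) = f(a),
% which, unpacking the epsilon-delta definition of the limit, means
\forall \varepsilon > 0 \;\exists \delta > 0 \;\forall x :
  |x - a| < \delta \implies |f(x) - f(a)| < \varepsilon.
```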

My question is: how am I to be sure, whenever I make a definition, that it matches the concept I am thinking of? In the example above, if for some reason my definition did not capture my concept of continuity, then the new knowledge I gained from my proofs would be falsely applied to a concept the proofs were not actually about.

This leads to my overall problem: how am I to be sure that a mathematical definition matches the intuition or concept I am thinking of in reality? One way I thought of to reassure myself is to list attributes of my idea and then prove that my definition implies those attributes. To be clear, I am not saying that every continuous function has to be thought of as one with no breaks, but at the very least the definition must imply that.
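As one illustration of this attribute-checking strategy (my example, not part of the original question): the intuitive attribute "no breaks" suggests a function cannot skip over values, and the limit definition of continuity does provably deliver this attribute, in the form of the intermediate value theorem:

```latex
% Intermediate Value Theorem: a provable consequence of the
% epsilon-delta definition that certifies the intuitive
% "no skipped values" attribute of continuity.
\text{If } f : [a,b] \to \mathbb{R} \text{ is continuous and }
f(a) < c < f(b), \text{ then } \exists\, x \in (a,b) : f(x) = c.
```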

This becomes even more critical to me when the definitions become less intuitive and more abstract.


There are 3 answers below.

On BEST ANSWER

The point of mathematics is not to match to reality.

This might have been one of the origins of mathematics, but it has grown and evolved long beyond that point. Mathematics is its own universe, and it deals with assumptions and consequences.

How do you know that mathematical definitions match reality? You don't. If you are in fact trying to model real-world settings, then you start with one set of definitions and try some test cases. If your definitions fit, you continue; if not, you change them. And the process continues on and on.

But mathematics, in general, is no longer dealing with modeling reality. If it were, there would be no infinite sets; no infinite objects; nothing larger than $2^{100000000000}$; no fractions; no irrational numbers; no negative numbers; absolutely nothing but $1,2,3,\ldots,n$ for some $n<2^{100000000000}$.

And besides, how do you know that your perception of reality is a good model of the actual reality?

On

This is not meant to be authoritative in any way, but it is my experience thus far. When learning math you often come in with intuitive concepts (such as continuity) and are taught the way that mathematicians have agreed to formalize those concepts (i.e. limits). This doesn't usually appear completely intuitive, and certainly the time it took to make Calculus more rigorous shows that it was not immediately apparent how to create the formal definitions which adequately captured the intuitive ideas. When learning existing concepts and definitions which are nearly universally accepted, I usually play with examples to see how the definitions best capture the intuition, and strive to find the counterexamples which delimit the usefulness of that definition's scope.

You mentioned reality in your question, but went on to give an example which many people may tell you has less reality to it than one would think: most physical things are not truly continuous, although it is often simplest to model them that way. That being said, I would like to split the remainder of my answer into a paragraph discussing "reality" as mathematicians view it (i.e. the reality of ideas and concepts currently undefined) and one discussing "reality" as most people see it (i.e. physical reality).

When creating new definitions to encompass a concept, mathematicians usually work with the concept extensively until they do begin to (hopefully) develop some intuition about it. At this point, many mathematicians may have varying viewpoints on which aspects are important, and many proposed definitions are independently put forth. Indeed, one does not have to venture far from continuity to see other examples of definitions which may well have taken the place of importance we currently assign to continuity; for instance, the page on the Hölder condition contains a nice chain of inclusions for varying degrees of "smoothness" that we could expect of a function. Part of what is at play here is a sort of evolutionary aspect, where useful definitions are reused more frequently and eventually become universal. A delicate balance between generality and usefulness of the definition eventually becomes the deciding factor.
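The chain of inclusions alluded to can be sketched as follows (stated here for functions on a closed, bounded interval, where each inclusion is standard; $C^{0,\alpha}$ denotes the $\alpha$-Hölder functions, with Lipschitz being the case $\alpha = 1$):

```latex
% On a closed, bounded interval [a,b], smoothness weakens step by
% step: each class is contained in the next.
C^{1}[a,b] \subset \text{Lipschitz}
          \subset C^{0,\alpha}[a,b] \;(0 < \alpha < 1)
          \subset \text{uniformly continuous}
          \subset C^{0}[a,b]
```

Any point along this chain could, in principle, have been singled out as "the" formalization of a well-behaved function; continuity is simply the one that proved most broadly useful.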

When considering whether mathematics accurately models "reality" in a physical sense, the question becomes more suited to other disciplines, but there is still certainly a significant question there. I was personally impressed by the level of detail given to this topic in Feynman's Lectures on Physics. Feynman introduces essentially the mathematical definition of a vector in physical terms, and then, whenever introducing a new physical quantity which is a vector (e.g. velocity), he goes into great detail to show that it indeed obeys the axioms expected of a vector. (The Feynman Lectures are available to read online.)
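Feynman's criterion, paraphrased rather than quoted, is that a set of quantities is a vector precisely when its components transform under a rotation of the coordinate axes the same way the components of a displacement do. Velocity inherits this property because differentiating in time commutes with a fixed rotation:

```latex
% Under a rotation of the axes by a fixed angle \theta,
% displacement components transform as
x' = x\cos\theta + y\sin\theta, \qquad
y' = -x\sin\theta + y\cos\theta.
% Differentiating with respect to t (with \theta constant) gives
v_x' = v_x\cos\theta + v_y\sin\theta, \qquad
v_y' = -v_x\sin\theta + v_y\cos\theta,
% so velocity transforms exactly like displacement: it is a vector.
```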

On

"This also becomes more critical to me when the definitions become less intuitive and more abstract."

I think a sort of platonic reality must be the litmus test for definitions of more abstract phenomena which are not observable physics. For instance, I question whether cardinality has been defined properly within ZF when we are unable to decide whether $2^{\omega} < 2^{\omega_1}$ without essentially adding it as an axiom. That is, we are unable to decide whether a larger set has more subsets. Absurd.