Could the multiplication be a not well defined operation?


I was thinking about the fundamentals of arithmetic, and in particular about the concept of number. I'm not an expert in the history of mathematics, but I suppose it all started when people realized that any physical object in the world could be represented by a graphic symbol. The symbol we have used for ages for this purpose is "$1$". Then, I think, people understood that instead of writing long sequences of $1$ every time to indicate a group of identical objects, it was faster and more useful to use other symbols. So, for example, to denote three trees you could write "$3$" instead of "$1-1-1$". You can think of a table that associates a rudimentary way of recording groups of objects (on the left) with a cleverer one (on the right): $$1-1 \rightarrow 2$$ $$1-1-1 \rightarrow 3$$ $$1-1-1-1 \rightarrow 4$$ $$...$$ Using this simple table you can define and understand the basic operations. Let's start with addition, taking $3+4=7$ as an example. $3$ corresponds to "$1-1-1$" and $4$ corresponds to "$1-1-1-1$". If you define the sum as the operation in which you consider the two sequences together as a single one, then you get "$1-1-1-1-1-1-1$" $\rightarrow 7$.
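The addition described above can be sketched in code. This is just an illustration of the tally model, with hypothetical helper names `tally` and `add` (not from any standard library): a number is a sequence of "$1$" marks, and the sum is the concatenation of the two sequences into a single one.

```python
def tally(n):
    """Build the tally sequence for n, e.g. tally(3) == '1-1-1'."""
    return "-".join("1" for _ in range(n))

def add(a, b):
    """Addition: place the two sequences together as a single one."""
    return a + "-" + b

three = tally(3)  # '1-1-1'
four = tally(4)   # '1-1-1-1'
assert add(three, four) == tally(7)  # '1-1-1-1-1-1-1'
```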
You can follow a similar process to define subtraction; take $7-3=4$ as an example. In this case you write the two corresponding sequences one above the other, like this:

"$1-1-1-1-1-1-1$"($\rightarrow 7$)
"$1-1-1$"($\rightarrow 3$)

Taking the difference, you get "$1-1-1-1$" ($\rightarrow 4$).
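Subtraction in the tally model can be sketched the same way, with the same hypothetical `tally` helper as a starting point: pair off the marks of the smaller sequence against the larger one and keep what remains.

```python
def tally(n):
    """Tally sequence for n, e.g. tally(3) == '1-1-1'."""
    return "-".join("1" for _ in range(n))

def subtract(a, b):
    """Subtraction: cross off one mark of a for each mark of b,
    and keep the marks of a that are left over."""
    a_marks = a.split("-")
    b_marks = b.split("-")
    assert len(b_marks) <= len(a_marks), "b must not exceed a"
    return "-".join(a_marks[len(b_marks):])

assert subtract(tally(7), tally(3)) == tally(4)  # '1-1-1-1'
```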
Now, finally, multiplication (and the issue). Consider the example $3*2=2*3=6$. Write the corresponding sequences: "$1-1-1$" and "$1-1$". As you can see, this is different from the previous operations, since you cannot read the result straight off the two sequences. Someone could say that you just have to take one sequence, repeat it two or three times (depending on which one you choose), and sum everything. The problem is that, technically, in following that process you do not use the concept of number as we defined it above. I mean, when you say "repeat it two or three times", that "two" and "three" do not refer to elementary sequences of ones, but to numbers. It's a bit subtle, but it makes sense. What I'm trying to say, in other words, is that the repetition of a number, defined as a sequence of a basic symbol, can't be carried out using another number. Now there are two options: the first is that when you write a multiplication you are not writing numbers (one of the two factors can't be a number as defined above, while the other one is); the second is that a number must be defined in a more general way. If the latter option is the correct one, I'd like to know how to define a number, because I have no idea.
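One possible way to make "repeat" precise without leaving the tally model is to let the marks of the second sequence themselves drive the repetition: lay down one copy of the first sequence for each mark of the second. This is only a sketch (the names `tally` and `multiply` are mine, not standard), and it does not settle the conceptual question, but it shows that the multiplier can be used as a tally rather than as a count.

```python
def tally(n):
    """Tally sequence for n, e.g. tally(3) == '1-1-1'."""
    return "-".join("1" for _ in range(n))

def multiply(a, b):
    """Multiplication: for each mark in b, lay down one copy of a.
    The repetition is driven by the marks of b themselves, so no
    separate notion of 'two times' or 'three times' is invoked."""
    result_marks = []
    for _ in b.split("-"):            # one pass per mark of b
        result_marks.extend(a.split("-"))
    return "-".join(result_marks)

assert multiply(tally(3), tally(2)) == tally(6)
assert multiply(tally(2), tally(3)) == tally(6)  # same result either way
```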


1 Answer


You should read about the Peano axioms. Those are normally used to define the natural numbers. The article also shows how to define addition and multiplication of natural numbers.