A lot of textbooks say it was hard for humans to accept zero when it was first introduced.
How could that be? It seems to me as natural as the positive integers: a number representing that there are no elements at all.
You can't count out zero 'things'; I would imagine that was the big issue. The same goes for the negative half of $\mathbb{Z}$.
Of course it seems natural to you; you grew up in the modern world, where everyone accepts zero. More importantly, people now accept the abstract concept of numbers and are capable of divorcing them from the things that they represent. This is a sophisticated point of view. From a more naive point of view, a number is a property of a collection of objects: when I say there are $2$ apples somewhere, that is a property of a collection of apples. If there are no apples, then what is there to "hold" the corresponding property? So instead of saying there are $0$ apples, people said there are no apples.
In computer science terms, apples.num() is not defined if there is no apples variable!
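To make the analogy concrete, here is a minimal Python sketch (the function name and data are hypothetical, chosen only to illustrate the point): asking "how many apples?" when no collection of apples exists at all is an error, not an answer of zero.

```python
def count_apples(basket):
    """Return how many apples are in the basket."""
    if basket is None:
        # No collection exists: the "how many" question itself is flawed,
        # mirroring the naive pre-zero point of view.
        raise ValueError("there is no basket of apples to count")
    # The modern point of view: an empty collection still has a count, 0.
    return len(basket)

print(count_apples(["gala", "fuji"]))  # a collection of 2 apples
print(count_apples([]))                # an empty collection: 0
```

The sophistication lies in the second call: treating the empty collection as something that still "holds" a count.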
You also have to understand that "human" in your statement means (from my understanding) "Western mathematicians." Indian mathematicians had no trouble with zero.
When did 'none' come to be a number? My guess is that it was when merchants or scribes first codified symbolic addition and subtraction to calculate quantities of food and supplies on hand. Intuitively, they must have known that 'none' would be the result of subtracting 100 bushels of wheat from same. And that adding 100 bushels to 'none' would give you a measurable quantity. Philosophers may have quibbled about whether 'none' was a real and true number, but I'm sure it was all too real to the merchants and scribes of antiquity.
In many places, e.g., in accounting, a zero meaning "nothing" is represented by a line.
For example, one writes $3.— for exactly 3 dollars, rather than $3.00.
And sometimes by xx, as written on a cheque:
---- three dollars ----- xx/100
Question: What is the name of the first female U.S. president?
Answer: There hasn't been one yet.
It is common to respond to a question by explaining that the question itself is flawed. Because there have been no female U.S. presidents yet, it doesn't make sense to ask for the name of the first one.
Question: How many apples do you have?
Answer: I don't have any apples.
Responding to a "how many" question can follow the same principle: the question is flawed because I don't have any apples, so it doesn't make sense to ask how many I have. It can be very, very difficult to revise one's thinking about something so basic; even today, there are many people who can't understand zero as a number, and instead think of it as a mere computational trick, interpret it as a sort of "not a number", or something similar.
The problem was that math started out of necessity. Shepherds needed to count how many sheep they owned, parents needed to count how many children they had, etc. For most simple mathematics, zero is simply a place-holder. Who needs to count zero sheep? Even in the most basic form of counting--using your fingers--who has zero fingers?

Zero is also closely tied to the concept of negative numbers, which are a very non-natural inclusion for mathematics that simply evolved out of necessity. What shepherd in his right mind has -1 sheep? Wrapping your head around the idea of zero is easy once the idea is conveyed, but it is somewhat non-intuitive to include arbitrarily. Even worse, once you do include zero, how do you deal with all the problems that come along with it, such as division by zero?

A great history of how we came to understand zero can be found in the book Zero: The Biography of a Dangerous Idea by Charles Seife.
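The division-by-zero problem mentioned above is easy to demonstrate. A minimal sketch in Python (the variable names are hypothetical): like most languages, Python refuses the operation outright rather than assign it a value.

```python
sheep = 0
try:
    # Once zero is admitted as a number, this expression is syntactically
    # legal but has no meaningful value, so Python raises an exception.
    grass_per_sheep = 100 / sheep
except ZeroDivisionError as error:
    print("division by zero is undefined:", error)
```

That an innocent-looking arithmetic expression can have no answer at all is exactly the kind of complication that made zero a "dangerous idea".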