I just read a textbook entry that said something like "counting is not measuring, so there are no assumptions". Does counting have assumptions?
First, is there an assumption of equal distance between the integers? Does the magnitude difference between 1 and 2 have to be the same magnitude difference as between 11 and 12, in terms of the thing being counted?
Second, do counts have to be independent of each other? For example, let's say I am counting apples, but the presence of any apple increases the probability of there being more apples (and this relationship escalates: the more apples there are, the more likely subsequent apples become).
Is counting measuring or math?
We use a "pure" and "idealised" form of counting: namely, the form of counting which is captured by measuring the cardinality of finite sets. There's no need to think of it in terms of sets if you don't want to, and certainly counting need not be defined in terms of the cardinalities of sets; but it may help you to clarify what counting means in maths if you consider it this way.
The cardinality of a set doesn't depend on which particular elements are in it, only on how many there are; so having apples present can't change the number of oranges. There is no physically meaningful distance between members of an arbitrary finite set, so you don't need to consider magnitude.
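To make the set-cardinality view concrete, here is a small Python sketch (illustrative only; the element names are made up) showing that `len()` of a set answers exactly the "how many" question while ignoring which elements are present, how far apart they are, or what else exists:

```python
# Counting as cardinality: len() of a finite set depends only on how many
# distinct elements it has, not on which elements they are.

apples = {"apple1", "apple2", "apple3"}
oranges = {"orange1", "orange2", "orange3"}

# Same cardinality, even though the elements differ entirely.
assert len(apples) == len(oranges) == 3

# "Distance" between elements is irrelevant: the gap between 1 and 2
# versus 11 and 12 plays no role in the count. These all count to 2.
assert len({1, 2}) == len({11, 12}) == len({"cat", 3.14}) == 2

# Adding apples never changes the count of oranges: each set's
# cardinality is independent of what is in any other set.
apples.add("apple4")
assert len(apples) == 4
assert len(oranges) == 3
```

The questioner's correlated-apples scenario is a statement about the *process that generates* the set, not about the act of counting it; once the set is fixed, its cardinality is a single well-defined number.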