The title is a bit of clickbait, but I think it's justified.
How I came to ask this question
Many programming languages have a concept of a hierarchy of numerical types. Often there is a numerical type that serves as a supertype of integers, reals, rationals and whatever other kinds of numbers that particular language happens to have. This supertype is often called simply the number type, and its properties are typically entirely at the mercy of the language designers (i.e. they seem arbitrary, or motivated by implementation concerns).
At first, I tried to imagine what the next possible step up the hierarchy of familiar numerical types (natural numbers <: integers <: rationals <: reals <: complex) would be, but nothing came to mind. What's more, in almost no programming language is it the case that, for example, an integer is a kind of rational, or a rational a kind of integer. But these types often share some properties of the number type I mentioned earlier.
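As an aside, one of the few languages that does implement such a tower is Python: its standard-library numbers module (PEP 3141) defines abstract base classes in which an integer really is a kind of rational. A minimal sketch, assuming only the standard library:

```python
# Python's numbers module defines the tower
# Number > Complex > Real > Rational > Integral,
# where each level is a subtype of the one above it.
import numbers
from fractions import Fraction

assert issubclass(numbers.Integral, numbers.Rational)
assert issubclass(numbers.Rational, numbers.Real)
assert issubclass(numbers.Real, numbers.Complex)
assert issubclass(numbers.Complex, numbers.Number)

# Concrete types slot into the tower: here an int *is* a Rational.
assert isinstance(3, numbers.Rational)
assert isinstance(Fraction(1, 2), numbers.Real)
assert not isinstance(1j, numbers.Real)  # complex values are not Real
```

Even here, though, the properties of Number itself are a design decision, not a mathematical discovery, which is exactly the point at issue.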
The question proper
Searching for different kinds of numbers in mathematics, I came to realize that a great many mathematical objects end up being called "numbers" of some kind: for example, p-adic numbers, half-integers, infinitesimals, and many, many more. The common properties I could extract from their descriptions are that these objects are typically equipped with two binary operations (as in the field axioms) and an order relation, although the sums and products vary wildly in their behaviour. Other properties may appear or disappear, depending on the particular numeric type in question. But this cannot be enough to pick out only the numbers: there are other fields which have all the same properties. How, then, do I distinguish numbers from non-numbers?
The answer I came up with so far
So far, the only "definition" I could come up with is that numbers are sets whose elements' properties are immaterial (as with urelements). This will, however, classify as numbers some groups, rings and fields which aren't usually called "numbers", though it does definitely exclude fields of matrices, computable functions, polynomials, etc. Is this a problematic answer? Is it outright wrong?
There's a very famous discussion in Wittgenstein's Philosophical Investigations where he talks of what have become known as "family resemblance concepts".
In §66, he takes the example of games. He considers various examples, points out their similarities and differences, and concludes that there is no single feature common to them all, only a complicated network of overlapping and criss-crossing similarities.
Wittgenstein goes on to suggest that many concepts are like this: for many concepts X it is a mistake to look for something in common to all and only the Xs. Rather the Xs exhibit a family resemblance (but need share no one distinctive trait in common).
He goes on to give another example -- and this is why the discussion is relevant to the question! -- namely the concept of a number.
Now, if I want, for this or that purpose, I can draw a rigid boundary around a concept. But Wittgenstein goes on to note that I can (and very often do) use a concept without any such sharp boundary.
And similarly for "number": informally, we can and do stretch the concept in various ways in new applications. And there need be no one thread in common to all these applications: it is good enough that our usage has enough family resemblance to prior uses.
Moral: we can come up with restrictive technical definitions of "number" for this or that technical purpose -- but it is misguided to look for a single once-and-for-all definition of "number" covering all its mathematical uses. In its informal general use, it is a somewhat messy family resemblance concept.