Most people use base 10; it's obviously the common notation in the modern world.
However, if we could change what became the common notation, would there be a better choice?
I'm aware that it may very well be that there is no intrinsically superior base, but for human purposes, is there a better one?
I've heard from sources such as this and this that base 12 is better, from here that base 8 is better, and, being into computer science, I would say that base 16 is the handiest.
Base 12 does seem to be the most supported non-base 10 number system, mainly due to the following reason pointed out by George Dvorsky:
First and foremost, 12 is a highly composite number — the smallest number with exactly four divisors: 2, 3, 4, and 6 (six if you count 1 and 12). As noted, 10 has only two. Consequently, 12 is much more practical when using fractions — it's easier to divide units of weights and measures into 12 parts, namely halves, thirds, and quarters.
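The divisibility claim above is easy to check directly. Here's a small Python sketch (the helper names `divisors` and `terminates` are mine, not from the quoted article) that lists the divisors of 10 and 12 and tests which unit fractions have terminating expansions in each base, using the fact that 1/d terminates in a base iff every prime factor of d divides the base:

```python
def divisors(n):
    """Divisors of n, excluding 1 and n itself (as in the quote)."""
    return [d for d in range(2, n) if n % d == 0]

def terminates(denominator, base):
    """True iff 1/denominator has a terminating expansion in the base,
    i.e. every prime factor of the denominator also divides the base."""
    d = denominator
    for p in range(2, d + 1):
        while d % p == 0:
            if base % p != 0:
                return False
            d //= p
    return True

print(divisors(10))  # [2, 5]
print(divisors(12))  # [2, 3, 4, 6]
# Halves, thirds, and quarters all terminate in base 12;
# in base 10, thirds do not.
print([n for n in (2, 3, 4) if terminates(n, 12)])  # [2, 3, 4]
print([n for n in (2, 3, 4) if terminates(n, 10)])  # [2, 4]
```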
And, on top of that, earlier societies regarded as very advanced used other systems: the Maya used base 20, and the Babylonians used base 60.
So, summarized, my question is: Is there an intrinsically superior base? If not, is there one that would be best for society's purposes? Or does the best base depend on the context it is being used in?
For computer applications, bases like 2, 8 and 16 are obviously the best. Given that a large percentage of numerical data these days is stored in and processed by computers, one could argue that what's good for computers is good for society.
Of the three I mentioned, I suppose that 8 or 16 would be better than base 2. Having the price of bananas as a binary number in the supermarket wouldn't work too well. Binary numbers are too long, and they all tend to look alike, so they're hard for people to read.
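To make the length difference concrete, here's a small sketch that renders the same number in several bases; the example price (199, as in "$1.99" in cents) is just one I picked for illustration:

```python
def to_base(n, base, digits="0123456789ABCDEF"):
    """Render a non-negative integer n in the given base (2..16)."""
    if n == 0:
        return "0"
    out = []
    while n:
        out.append(digits[n % base])
        n //= base
    return "".join(reversed(out))

price = 199  # cents
for base in (2, 8, 10, 16):
    print(f"base {base:>2}: {to_base(price, base)}")
# base 2 needs 8 digits (11000111) where base 16 needs only 2 (C7)
```

The same value takes 8 symbols in binary but only 3 in decimal or 2 in hexadecimal, which is exactly the readability problem with binary price tags.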
In the world at large (as opposed to the narrower world of mathematics and computers), reading numbers is probably just as important as doing arithmetic with them. Think of speed limit signs on roads, distances of journeys, prices in stores, or temperatures in weather forecasts. These numbers need to be read and understood quickly (by human beings), and I doubt that this would be possible if they were written in binary. We'd no longer be taking advantage of the wonderful human ability to quickly recognize symbols, and it would be a pity to waste that ability just so that we can make computing easier (in my opinion).