Is binary or hexadecimal better than the decimal system?


Humans have evolved to use decimal. But I have two views suggesting that either hexadecimal or binary would be better:

  1. Hexadecimal would be better, as numbers take fewer digits in hexadecimal than in decimal.
  2. Binary would be better, as it is simpler, with only two symbols.

If we started calculating in hexadecimal or binary, would it save time? Which one of these is correct, or are they both wrong?

Thanks

There are 5 answers below

BEST ANSWER

They're all excellent for their own uses. There is no number system which is objectively "better" than any other.

When working directly with computers and low-level programming, number systems based on powers of $2$ are easier to use because they translate more easily to bits of $0$ and $1$.

If you want to talk about numbers to members of the general public, you should use base ten, because otherwise you run a great risk of not being understood properly.

If you're looking at a watch, you should think in base $60$: you never think that $10.31$ hours have passed since midnight; you think that the time is $10:18:36$.
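As a quick sketch (my own, not part of the answer), converting a decimal hour count into base-$60$ "digits" is just repeated division; the helper name here is made up for illustration:

```python
# Convert a decimal hour count into base-60 "digits": hours, minutes, seconds.
def hours_to_hms(decimal_hours):
    total_seconds = round(decimal_hours * 3600)
    hours, rem = divmod(total_seconds, 3600)  # peel off the base-60^2 place
    minutes, seconds = divmod(rem, 60)        # then the base-60 place
    return hours, minutes, seconds

print(hours_to_hms(10.31))  # (10, 18, 36)
```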

The time and space you save by being "more efficient" isn't worth the confusion you cause by picking the wrong system for the situation.

ANSWER

Each has its own place. We could also consider octal (base 8). Octal and hexadecimal (base 16) numbers are a reasonable compromise between the binary (base 2) system computers use and the decimal (base 10) system most humans use.

Computers aren't good with multiple symbols, so base 2 (where you only have two symbols) suits them, while longer strings (numbers with more digits) are less of a problem. Humans are very good with multiple symbols, but aren't as good at remembering long strings.

Octal and hex use the human advantage of working with many symbols while remaining easily convertible back and forth to binary, because every hex digit represents 4 binary digits ($16=2^4$) and every octal digit represents 3 ($8=2^3$). I think hex wins over octal because it can easily be used to represent bytes and 16/32/64-bit numbers.
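A short Python illustration (not part of the original answer) of this regrouping: converting between binary, octal, and hex never requires arithmetic, only chunking the bits.

```python
# Each hex digit corresponds to exactly 4 binary digits, and each octal
# digit to exactly 3, so conversion is just regrouping bits.
n = 0b11010111  # 215 in decimal

# Group bits in fours: 1101 0111 -> D 7
print(f"{n:08b} -> hex {n:X}")    # 11010111 -> hex D7

# Group bits in threes: 011 010 111 -> 3 2 7
print(f"{n:09b} -> octal {n:o}")  # 011010111 -> octal 327
```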

ANSWER

The largest attested number system used by early humans is base $120$, by the Germanic peoples. Even in the earliest recorded Germanic language (Gothic), there is a word that translates to 'decimal' (in the sense of ten-count); without it, a 'hundred' is read as $120$, while a 'hundred decimal' is $100$.

I have been using this number system since the 1970s. I devised the processes needed to do long arithmetic in this base in 1988. In this form it is not much more complex than decimal, and easier than base $60$. The algorithm benefits from a small difference between the factor pair: for $120 = 12 \times 10$ the difference is $12-10=2$, while for $60 = 10 \times 6$ it is $10-6=4$, which makes the latter harder.
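As a rough illustration (my own sketch, not the author's 1988 method), repeated division by the base yields the digits of any integer in base $120$:

```python
# Illustrative only: convert a non-negative integer to a list of
# base-120 digits, most significant first, by repeated division.
def to_base_120(n):
    digits = []
    while n > 0:
        n, r = divmod(n, 120)
        digits.append(r)
    return digits[::-1] or [0]

# The old Germanic "hundred" of 120 is a round number in this base:
print(to_base_120(120))   # [1, 0]
print(to_base_120(2024))  # [16, 104], since 16*120 + 104 = 2024
```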

Bases that are prime powers have technical uses, but outside of those they are not much use at all.

ANSWER

The decimal system works well and is the system most people are most comfortable with, so in that sense it is the best for humans. This does not mean that people can't think in binary or hexadecimal, just that to most of us it feels unnatural.

Computers work by moving charges around, so each bit of information is either a charge or no charge, giving two states, '1' or '0'; computers in effect 'think' in binary. Octal and hexadecimal are simply a convenience: each octal digit represents 3 binary bits and each hexadecimal digit represents 4, which lets us write the same numbers in fewer characters. Hexadecimal is now more popular than octal because most modern computer systems store and process data in words that are an integer multiple of 4 bits.

There are times, however, when we want computers to think in decimal, for example with money. You cannot represent £0.01 exactly in binary unless you treat all currency as integers, with the least significant unit being one penny, and worry about placing the decimal point correctly later. Otherwise you risk getting your sums wrong: a series of penny transactions may not give the expected result due to rounding errors. This is less efficient in terms of storage, but it ensures you get the answer you expect.
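A small Python sketch (my own, not from the answer) showing the drift: £0.01 has no exact binary representation, so adding it repeatedly accumulates error, while integer pence stays exact.

```python
# Binary floating point cannot represent 0.01 exactly, so repeated
# penny arithmetic drifts; counting whole pence as integers is exact.
total_float = 0.0
total_pence = 0
for _ in range(1000):
    total_float += 0.01  # each addition carries a tiny rounding error
    total_pence += 1     # integer arithmetic: no error

print(total_float)        # close to 10.0, but not exactly 10.0
print(total_pence / 100)  # 10.0, decimal point placed only at display time
```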

So there is no best system: it depends on what you use it for.

ANSWER

For computers, binary is best; hexadecimal is good for computer programmers; decimal is good for everyone else.

There have been flirtations with ternary computers, and some of the earliest computers Donald Knuth used were decimal computers. But binary has clearly won out as far as computers are concerned.

The problem with binary is that even small numbers (like $1729$) have many digits. But programmers need to mediate between binary and decimal, so hexadecimal is the perfect go-between. With a few exceptions, every compiler or interpreter can understand hexadecimal.
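For example, in Python (my own illustration): $1729$ needs eleven binary digits but only three hex digits, and the interpreter accepts hex literals directly.

```python
# 1729 takes 11 digits in binary; hex packs each 4 bits into one digit.
n = 1729
print(f"{n:b}")  # 11011000001
print(f"{n:X}")  # 6C1

# Hex literals are understood directly, so they serve as the go-between:
assert 0x6C1 == 0b11011000001 == 1729
```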

As for humans, we've been using decimal for what, five centuries? Would the effort to change be justified by switching to another base of numeration? And if we're really going to tilt at windmills, I suggest we do it for duodecimal instead of hexadecimal.