When we consider calculations at the tiniest of scales, which number system would be more accurate: the binary number system (base 2) or the number system we generally use (base 10)?
To put it another way: if we consider number systems with bases 1, 2, 3, 4, ..., does the choice of base affect the accuracy with which calculations can be done, or does it not matter at all?
Or: how does having a different base affect the accuracy of numbers? Can we prove, or determine, by how much the results differ when the same calculation is done with base 2 and base 10 numbers?
Without rounding, the base doesn't matter at all. With rounding, the base 10 number system is in a sense more general than binary: whenever something can be represented exactly in a finite number of digits in binary, the same is true in decimal, but there are numbers with a finite-length base 10 representation that have no finite binary representation. (This holds whenever one base is a multiple of another; for example, whenever something has a finite-length representation in base 3 it also has one in base 9, but not vice versa.) Also, if you are rounding to the same number of digits, decimal is far more accurate than binary.
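One way to see this concretely (a small sketch in Python, whose `float` type stores binary fractions): 1/2 has a finite binary expansion and is stored exactly, while 1/10 does not, so the float actually holds the nearest representable binary fraction, which `decimal.Decimal` reveals in full.

```python
from decimal import Decimal

# 0.5 = 1/2 has a finite binary expansion (0.1 in base 2),
# so the float stores it exactly:
half = Decimal(0.5)
print(half)    # 0.5

# 0.1 = 1/10 has no finite binary expansion; the float stores
# the nearest binary fraction, which Decimal displays exactly:
tenth = Decimal(0.1)
print(tenth)   # 0.1000000000000000055511151231257827021181583404541015625
```

The converse direction never fails: any finite binary fraction m/2^n equals m·5^n/10^n, which is finite in decimal.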
As long as you have enough digits for your application, the difference is insignificant; but if you're looking for economy in length, decimal has more bang for the buck.
Essentially, the larger the base, the more accuracy you get per digit after the point. This has to be weighed against the fact that you need more symbols for the digits. Because binary has only two digits, it is the simplest and easiest to use on a computer. For humans, though, in virtually any application where approximation is necessary the numbers are printed out in decimal, if only because that is what people are familiar with.
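The accuracy-per-digit point can be quantified: rounding to d fractional digits in base b incurs a worst-case error of (1/2)·b^(−d). A short sketch (the helper `round_frac` is hypothetical, not a library function) comparing five fractional digits in base 2 versus base 10:

```python
import math

def round_frac(x, base, digits):
    """Hypothetical helper: round x to `digits` fractional digits in `base`."""
    scale = base ** digits
    return round(x * scale) / scale

# Round pi to five fractional digits in each base. The worst-case
# error is (1/2) * base**(-digits), so base 10 is far tighter
# for the same number of digits.
err_bin = abs(math.pi - round_frac(math.pi, 2, 5))   # about 0.0147
err_dec = abs(math.pi - round_frac(math.pi, 10, 5))  # about 0.0000027
print(err_bin, err_dec)
```

With the same five digits, the decimal error bound is 5^5 = 3125 times smaller than the binary one, which is the "more bang for the buck" above made precise.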