When you think about it, our calculators basically do nothing at all except change the syntax of the number that you enter. We enter a number with multiplication, division, and other symbols in it, and the calculator gives back the same number in a different format.
Why do we call them calculators if they don't actually calculate anything?
You seem to assume that we don't care about the presentation of a number. In fact, it's quite the opposite: we care a lot about how a number is presented, and having a tool to switch between various presentations (this is a bit more general than a calculator, on the face of it) or at least to switch from many presentations to one standard kind of presentation (this is what a calculator really does) is extremely useful.
Note, for example, that different notations make different questions easier or harder. For instance, consider $$208403=13\cdot 17\cdot 23\cdot 41.$$ We have decimal notation on the left, and prime factorization on the right. The latter notation is great for answering questions like "Is $208403$ divisible by $29$?", while the former is much better for comparing two quantities (e.g. which is larger: $13\cdot 17\cdot 23\cdot 41$ or $11\cdot 19 \cdot 29 \cdot 37$?), not to mention adding them (try finding the prime factorization of the sum of two numbers whose prime factorizations you know already - it's annoyingly hard!). Which one is more important? Well, as a matter of historical development we seem to have agreed, as a society, that decimal notation is the most expedient (or at least, sufficiently expedient that picking it as the "standard" presentation is a good idea).
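To make the trade-off concrete, here's a small sketch (in Python, chosen arbitrarily; the helper names are mine, not standard): once you hold a number in factored form, divisibility by a prime is a mere lookup, but adding two factored numbers forces you back through decimal and a fresh factorization.

```python
def factorize(n):
    """Prime factorization by trial division: returns {prime: exponent}."""
    factors = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1
    return factors

def divisible_by(factors, p):
    """With the factorization in hand, divisibility by a prime is a lookup."""
    return p in factors

def to_int(factors):
    """Convert factored notation back to decimal notation."""
    n = 1
    for p, e in factors.items():
        n *= p ** e
    return n

a = {13: 1, 17: 1, 23: 1, 41: 1}   # 208403 in factored notation
print(divisible_by(a, 29))          # False, with no arithmetic at all

# Adding in factored notation has no shortcut: convert, add, refactor.
b = {11: 1, 19: 1, 29: 1, 37: 1}   # the other number from the text
s = to_int(a) + to_int(b)
print(s, factorize(s))              # the sum's factors look nothing like a's or b's
```

Notice that `factorize` does real work (trial division) while `divisible_by` does none; that asymmetry is exactly why the choice of presentation matters.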
There's also the issue of generalizing presentations - e.g. decimal notation extends in a natural way to all reals, while prime factorization really doesn't. But that's another more complicated can of worms. Let's just say that even when we lay aside the difficulty of answering specific questions, there are serious mathematical differences between different kinds of presentations of numbers.
There are various abstractions of the idea of "standard presentation" of a number - see the concept of normal form in abstract or universal algebra (or even computability theory!), and also very tangentially term models in mathematical logic.
By the way, a similar thing is going on when we solve equations: the statements "$17x+3=12$" and "$x={9\over 17}$" mean the same thing, so in some sense are the same equation; but clearly they're presented in different ways.
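The equation-solving analogy can be checked in one line of exact arithmetic (a Python sketch using the standard `fractions` module): the "solved" presentation $x={9\over 17}$ names the very value that satisfies the "unsolved" presentation $17x+3=12$.

```python
from fractions import Fraction

# The solved form hands us the value directly...
x = Fraction(9, 17)

# ...and plugging it into the unsolved form confirms they agree.
print(17 * x + 3)        # 12
print(17 * x + 3 == 12)  # True
```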