I was reading about how Hindu-Arabic notation (the notation now used worldwide for numbers) was beneficial for mathematics: it makes some operations, such as multiplication, much easier, and it simplified the writing of numbers. Before Hindu-Arabic notation it was hard to write numbers and perform operations with, for example, Greek notation.
And I was wondering whether different notations could exist for numbers, or matrices, or functions, or ... that would have different properties from the usual ones and make some operations easier, with matrices for example.
Is there a research area devoted to this topic?
I'm not a mathematician, so I'm sorry if this question isn't well posed.
To give at least a placeholder response:
Yes, questions of algorithmic efficiency, which implicitly depend on how integers, real numbers, matrices, etc., are represented, do play a large role in modern mathematics and computer science (and any science that depends on computation).
Decimals are not wonderful, but they're ok, and so are their computer-representation equivalents, more or less.
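As one concrete illustration of the "not wonderful, but ok" point: the binary floating-point representation used by most computers cannot represent the decimal 0.1 exactly, which produces small but visible quirks. This sketch uses Python's standard `decimal` module purely as an example of an alternative representation with different trade-offs (exact decimal arithmetic, at the cost of speed):

```python
from decimal import Decimal

# Binary floating point cannot store 0.1 exactly, so decimal-looking
# arithmetic accumulates a tiny representation error:
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# An alternative representation (exact decimal) behaves differently:
print(Decimal("0.1") + Decimal("0.2"))                   # 0.3
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3")) # True
```

So even for plain real-number arithmetic, the choice of representation changes which operations are exact and which are cheap, which is exactly the kind of trade-off the question asks about.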
Representation of (real?) numbers is not usually the serious bottleneck in algorithms, as far as I know (but the future may tell us otherwise?).
Ideas such as the Fast Fourier Transform do show that the "obvious" ways of computing things can be markedly inefficient, especially at scale.
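To make the FFT point concrete, here is a minimal sketch of the recursive Cooley-Tukey FFT (for input lengths that are powers of two). The naive discrete Fourier transform needs on the order of n^2 operations; this divide-and-conquer rearrangement of the same computation needs only on the order of n log n:

```python
import cmath

def fft(a):
    """Recursive Cooley-Tukey FFT; len(a) must be a power of two.
    Computes the same values as the naive O(n^2) DFT in O(n log n)."""
    n = len(a)
    if n == 1:
        return a[:]
    # Split into even- and odd-indexed samples and transform each half.
    even = fft(a[0::2])
    odd = fft(a[1::2])
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor combines the two half-size transforms.
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out
```

The asymptotic win comes purely from reorganizing the arithmetic, not from any change in what is being computed, which is the "obvious ideas are inefficient" phenomenon in a nutshell.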
Even the "square and multiply" algorithm for exponentiation is a bit surprising, as it scales up much better than the naive approach.
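A short sketch of square-and-multiply, for the curious: instead of multiplying by the base exp - 1 times, it walks the binary digits of the exponent, squaring at each step, for about log2(exp) squarings. The optional modulus argument is how modular exponentiation (as used in cryptography) is typically done:

```python
def power(base, exp, mod=None):
    """Compute base**exp (optionally mod `mod`) by square-and-multiply:
    O(log exp) multiplications instead of the naive exp - 1."""
    result = 1
    while exp > 0:
        if exp & 1:  # current binary digit of the exponent is 1
            result *= base
            if mod is not None:
                result %= mod
        base *= base  # square the base for the next binary digit
        if mod is not None:
            base %= mod
        exp >>= 1
    return result
```

For a 1024-bit exponent this is roughly a thousand multiplications rather than about 2^1024, again a gain that comes entirely from exploiting how the exponent is written in positional (binary) notation.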