I am looking into repeated operations, and it seems really hard to precisely define multiplication.
Of course, for a positive integer $b$ and a real number $a$, we use the grade-school definition we all know:
$$ab = \underbrace{a + a + a + \cdots + a}_{b\text{ times}}$$
but what about for real numbers $a$ and $b$?
For exponentiation (which, for integer exponents, is just repeated multiplication), we have a precise formula, easy to derive from $a^x = e^{x\ln a}$ and the power series of $e^t$:
$$a^x = \sum_{n=0}^{\infty} \frac{x^n \left(\ln(a)\right)^n}{n!}$$
which is nice because the sum only involves integer powers, which we already know how to define:
$$x^n = \underbrace{x \times x\times x \times \cdots \times x}_{n\text{ times}}$$
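As a rough illustration (a sketch, not a foundation: it leans on `math.log` to supply $\ln a$, and on floating-point arithmetic), the series can be summed term by term in Python:

```python
import math

def exp_series(a, x, terms=50):
    """Approximate a**x by truncating the series sum_{n>=0} (x*ln(a))**n / n!.

    Illustrative only: math.log already does the hard numerical work.
    """
    t = x * math.log(a)      # the series is exp(t) expanded term by term
    total, term = 0.0, 1.0   # term starts at t**0 / 0! = 1
    for n in range(terms):
        total += term
        term *= t / (n + 1)  # next term: t**(n+1) / (n+1)!
    return total

print(exp_series(2.0, 10.0))  # close to 2**10 = 1024
print(exp_series(3.0, 0.5))   # close to sqrt(3)
```

Each term is built from the previous one by a single multiplication and division, which avoids computing large powers and factorials separately.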
But this just raises the question of how we define $x \times x$ precisely.
Is there an analogous formula for multiplication?
How does a calculator compute multiplication of reals?
Note: From what I have read, even approximating multiplication of real numbers requires calculus or numerical methods. I cannot see why such advanced tools are needed to precisely define this fundamental operation, especially compared with the simple formula for exponentiation. Either way, I still don't have a formula.
How we define these fundamental operations depends on how we define (or construct) the real numbers. In the course of that construction, we define what we mean by "adding" and "multiplying" two numbers, and this is done so that the result is a field with several other properties. See, for example, Construction of the real numbers.
However, one way to think about the definition of the product of two arbitrary real numbers $a$ and $b$ is to take sequences $\left\lbrace a_n\right\rbrace$ and $\left\lbrace b_n\right\rbrace$ of rational numbers that approximate $a$ and $b$, respectively (that is, whose limits are $a$ and $b$), and define the product $ab$ as $\displaystyle\lim_{n\rightarrow \infty} (a_n b_n)$. This makes sense because the product of two rational numbers can be defined "intuitively", via integer arithmetic on numerators and denominators.
Of course, one needs to prove that this product is well defined (basically, that you get the same result no matter which sequences you choose, as long as their limits are $a$ and $b$), but that is one way to think about how the product can be defined.
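As a rough numerical illustration of that well-definedness (a sketch only: Python floats stand in for the "real" number $\sqrt{2}$, and `rational_approx` and `cf_sqrt2` are names I made up for this example), here are two different rational sequences converging to $\sqrt{2}$, whose term-by-term products both approach $\sqrt{2}\cdot\sqrt{2} = 2$:

```python
import math
from fractions import Fraction

def rational_approx(x, n):
    """n-th term of one rational sequence converging to x: truncate to n decimals."""
    return Fraction(int(x * 10 ** n), 10 ** n)

def cf_sqrt2(n):
    """n-th continued-fraction convergent of sqrt(2): 3/2, 7/5, 17/12, 41/29, ..."""
    p, q = 1, 1
    for _ in range(n):
        p, q = p + 2 * q, p + q
    return Fraction(p, q)

# Floats stand in for the abstract real number sqrt(2), purely for illustration.
a = b = math.sqrt(2)

for n in range(1, 6):
    dec = rational_approx(a, n) * rational_approx(b, n)  # decimal truncations
    cf = cf_sqrt2(n) * cf_sqrt2(n)                       # convergents
    print(n, float(dec), float(cf))  # both products approach ab = 2
```

The two sequences are genuinely different at every index, yet the products $a_n b_n$ converge to the same limit, which is exactly the property the well-definedness proof establishes in general.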
P.S. Note that for this to make sense, we must first define what we mean by "real numbers" $a$ and $b$, so what I wrote here is really a property of the real numbers rather than a definition of multiplication. I suggest reading about Dedekind cuts and Cauchy sequences.