What axiom or definition says that mathematical operations like +, -, /, and * operate on imaginary numbers?
In the beginning, when there were just the reals, these operations were defined for them. Then $i$ was created: seemingly a number whose value is undefined, just as one divided by zero is undefined.
Does anyone know how the domains and ranges of the mathematical operations were expanded to include the imaginaries?
EDIT: An interesting comment notes an early use of complex numbers in which those values would cancel in the end. But can't I refute that with "from an inconsistency, anything is provable"?
A corollary question: could I define a new number $z = 1/0$ and simply begin using it? That seems ludicrous.
The "necessary and sufficient" definition of the complex numbers is this: they are ordered pairs $(a,b)$ of real numbers, with the operations
$$(a,b)+(a',b')=(a+a',b+b')$$
$$(a,b)\cdot(a',b')=(aa'-bb',ab'+a'b).$$
(Subtraction and division can be defined as the inverses of addition and multiplication, as usual.)
In particular,
$$(a,b)+(0,0)=(a,b)$$ so that $(0,0)$ is the zero and
$$(a,b)\cdot(1,0)=(a,b)$$ so that $(1,0)$ is the unity.
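The two defining rules above can be sketched directly, treating a complex number as a plain pair of reals. This is a minimal illustration of the definition, not a library implementation; the names `add` and `mul` are my own:

```python
# Complex numbers as ordered pairs of reals, using only
# the two defining rules from the answer.

def add(p, q):
    a, b = p
    c, d = q
    return (a + c, b + d)            # (a,b)+(a',b') = (a+a', b+b')

def mul(p, q):
    a, b = p
    c, d = q
    return (a*c - b*d, a*d + c*b)    # (a,b)*(a',b') = (aa'-bb', ab'+a'b)

# (0,0) acts as the zero and (1,0) as the unity:
print(add((3, 4), (0, 0)))  # (3, 4)
print(mul((3, 4), (1, 0)))  # (3, 4)
```

Nothing beyond real addition and multiplication is used, which is the point: the pair operations are built entirely from already-defined real ones.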
As you can check, $(a,b)$ can also be represented as the expression $a+ib$, where $i$ is a reserved symbol, with the usual computation rules on polynomials (with $i$ seen as the variable). Using this notation, $$(0,1)\cdot(0,1)=(-1,0)$$
translates to the famous
$$i^2=-1.$$
As you can check, the "pair" representation and the "$i$" representation are completely interchangeable. $i$ has a simple geometric interpretation: in a 2D plane, multiplication by $i$ corresponds to a rotation around the origin by a quarter turn.
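Both facts, $i^2 = -1$ and the quarter-turn interpretation, can be checked numerically with the pair multiplication rule. A small self-contained sketch (the function name `mul` is mine):

```python
# Verify i*i = -1 and the quarter-turn rotation, with i = (0, 1).

def mul(p, q):
    a, b = p
    c, d = q
    return (a*c - b*d, a*d + c*b)    # (a,b)*(a',b') = (aa'-bb', ab'+a'b)

i = (0, 1)
print(mul(i, i))  # (-1, 0), i.e. i^2 = -1

# Repeated multiplication by i rotates a point a quarter turn each time;
# four turns bring (1, 0) back to itself:
p = (1, 0)
for _ in range(4):
    p = mul(i, p)
    print(p)      # (0, 1), (-1, 0), (0, -1), (1, 0)
```

The orbit $(1,0) \to (0,1) \to (-1,0) \to (0,-1) \to (1,0)$ is exactly the four quarter-turn positions on the unit circle.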
Note that there are no undefined operations here (division by $(0,0)$ remains undefined, exactly as division by $0$ is in the reals).