Is there a fundamental difference in the things we call multiplication and those we call addition?
In a field, both binary operations obey exactly the same rules: commutativity, associativity, an identity element, and inverses (with the single exception that $0$ has no multiplicative inverse). In a ring, some of the multiplicative rules of a field are relaxed, but it seems we could just as easily have relaxed the additive rules instead.
Even the distributive law could be defined the other way around, so that addition distributes over multiplication rather than the reverse; it is a distinguishing property -- when it even applies -- only because we've decided it should be.
Other than the fact that we often require both operations on whatever set of "numbers" we're considering, is there some property of addition that is never shared by multiplication (and vice versa), no matter which generalization of each we choose? That is, can we state necessary and sufficient conditions for a binary operation to be called "addition" or "multiplication"?
Claude Shannon's master's thesis, a seminal contribution to Boolean algebra and electrical engineering, used the notation of addition and multiplication for the two operations that we now think of as AND (multiplication) and OR (addition), applied to the elements 0 and 1. In this case, not only does multiplication distribute over addition, $x(y+z)=xy+xz$, but addition also distributes over multiplication, $x+yz=(x+y)(x+z)$. (These are Shannon's equations 3a and 3b.)
Boolean algebra is completely symmetric in the two operations -- it doesn't matter which one you call addition and which one you call multiplication.
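This symmetry is easy to check exhaustively, since there are only eight triples of inputs. Here is a small sketch (the helper names `add` and `mul` are my own) verifying both distributive laws with OR playing the role of addition and AND the role of multiplication:

```python
# Check Shannon's dual distributive laws in Boolean algebra on {0, 1},
# treating OR as "addition" and AND as "multiplication".
from itertools import product


def add(a, b):
    # Boolean OR plays the role of addition
    return a | b


def mul(a, b):
    # Boolean AND plays the role of multiplication
    return a & b


for x, y, z in product((0, 1), repeat=3):
    # 3a: multiplication distributes over addition: x(y+z) = xy + xz
    assert mul(x, add(y, z)) == add(mul(x, y), mul(x, z))
    # 3b: addition distributes over multiplication: x + yz = (x+y)(x+z)
    assert add(x, mul(y, z)) == mul(add(x, y), add(x, z))

print("Both distributive laws hold on all 8 input triples.")
```

Swapping the two helper functions leaves the checks unchanged, which is exactly the symmetry at issue: nothing in the algebra itself tells you which operation "deserves" to be called addition.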
The usage of the terms "addition" and "multiplication" is like many issues of notation: Authors use whatever seems most natural to them, and as long as it's defined clearly, readers will deal with it.