This is a fact we use very frequently in general mathematics when we write such notations as $1+2+3+4$: since we know that $+$ is commutative and associative, we can just "drop the parentheses" and not worry about order of operations. Of course I believe this, but how does one prove this in full generality? Even stating it is giving me trouble. Here's my attempt:
Assume an operation $\oplus:S\times S\to S$ is provided satisfying $x\oplus y=y\oplus x$ and $x\oplus(y\oplus z)=(x\oplus y)\oplus z$ for all $x,y,z\in S$.
Claim: Given any finite set $\emptyset\subset A\subseteq S$, there exists a unique $z\in S$ such that for any function $f:{\cal P}(A)\to{\cal P}(A)$ which satisfies $\emptyset \subset f(B)\subset B$ for all $B\subseteq A$ with $|B|\ge 2$, and any function $g:{\cal P}(A)\setminus\{\emptyset\}\to S$ which satisfies $g(\{x\})=x$ for all $x\in A$ and $g(B)=g(f(B))\oplus g(B-f(B))$ for all $|B|\ge 2$, we have $g(A)=z$.
The operation $\oplus$ does not necessarily have an identity element, so we do not attempt to define an empty sum. Intuitively, this element $z$ represents the finite sum of the elements in $A$, so if $A=\{1,2,3\}$ and $f(\{1,2,3\})=\{1\}$ and $f(\{2,3\})=\{3\}$, then $$z=g(\{1,2,3\})=g(\{1\})\oplus g(\{2,3\})=g(\{1\})\oplus (g(\{3\})\oplus g(\{2\}))=1\oplus(3\oplus 2).$$ There has got to be a better way to say that, but this is the only way I can think of to capture all the possibilities of parenthesization, and still be amenable to a formal proof. And now that I've stated it, how should I prove it? I suppose I should induct on something, but I've no idea what.
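To convince myself the claim is even plausible, here is a small sketch (in Python, with hypothetical names `all_splits` and `all_values`) that, for a small $A$, enumerates every admissible choice of $f$ and collects every value $g(A)$ can take; the claim says this set should be a singleton:

```python
def all_splits(A):
    """All ways to split frozenset A into two disjoint nonempty pieces
    (f(A), A - f(A)), one pair per nonempty proper subset."""
    elems = sorted(A)
    n = len(elems)
    splits = []
    for mask in range(1, 2 ** n - 1):  # skip the empty and full subsets
        first = frozenset(e for i, e in enumerate(elems) if mask >> i & 1)
        splits.append((first, A - first))
    return splits

def all_values(A, op):
    """Every value g(A) can take, over all admissible choice functions f."""
    if len(A) == 1:
        (x,) = A
        return {x}
    vals = set()
    for B, C in all_splits(A):           # every choice of f(A) = B
        for b in all_values(B, op):      # every g(B), recursively
            for c in all_values(C, op):  # every g(A - B), recursively
                vals.add(op(b, c))
    return vals

print(all_values(frozenset({1, 2, 3, 4}), lambda x, y: x + y))  # {10}
```

Of course this only checks small instances; it is no substitute for the inductive proof.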
Edit: The goal here is to be able to define an operation $F$ such that $F(\{x_1,\dots,x_n\})=x_1\oplus\cdots\oplus x_n$ and be assured that the operation is well defined and satisfies $F(A\cup B)=F(A)\oplus F(B)$, when $A$ and $B$ are disjoint finite nonempty subsets of $S$.
So you want to prove that if $S$ is a set and $+$ is a commutative, associative binary operation $S \times S \to S$, then there is a unique function $\sum$ from nonempty finite subsets of $S$ to $S$ satisfying:
- if $x \in S$, then $\sum \{x\} = x$;
- if $A$ and $B$ are nonempty finite subsets with $A \cap B = \emptyset$, then $\sum (A \cup B) = \sum A + \sum B$.
Prove this by induction on the size of the subset: if $|A| = 1$ then we have no choice.
If $|A| \ge 2$, suppose we can define $\sum A$ in several ways: $A = B_1 \cup B_2 = C_1 \cup C_2$, where $B_1 \cap B_2 = \emptyset$ and $C_1 \cap C_2 = \emptyset$, with all four sets nonempty. Using the induction hypothesis, the sums of the proper subsets $B_i$ and $C_j$ are well-defined, and we need to show that $\sum B_1 + \sum B_2 = \sum C_1 + \sum C_2$.
Let $D_{ij} = B_i \cap C_j$. Suppose for now that none of them is empty. Then $\sum B_1 = \sum D_{11} + \sum D_{12}$ and so on, and the claim boils down to proving $\forall a,b,c,d \in S, (a+b)+(c+d) = (a+c)+(b+d)$.
And this is easy : $(a+b)+(c+d) = a+(b+(c+d)) = a+((c+d)+b) = a+(c+(d+b)) = (a+c)+(d+b) = (a+c)+(b+d)$.
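As a sanity check (not a proof), this "exchange law" can be verified exhaustively for a few familiar commutative, associative operations; a quick Python sketch:

```python
from itertools import product
from math import gcd

# Exhaustively check (a+b)+(c+d) == (a+c)+(b+d) for several
# commutative, associative operations over a small range of values.
ops = {"add": lambda x, y: x + y, "mul": lambda x, y: x * y,
       "max": max, "gcd": gcd}
for name, op in ops.items():
    for a, b, c, d in product(range(1, 6), repeat=4):
        assert op(op(a, b), op(c, d)) == op(op(a, c), op(b, d)), name
print("exchange law verified for", ", ".join(ops))
```

The point of the five-step chain above is that it uses only the two axioms, so it works in any commutative semigroup, not just these examples.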
The cases where one or more of the $D_{ij}$ is empty are done in a similar way (and are even easier). Or you could add an element $\star$ to $S$ and extend $+$ by defining $\star + x = x + \star = x$, prove that $+$ is still commutative and associative, and finally define $\sum \emptyset = \star$.
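That last trick can be sketched in code: adjoin a fresh sentinel acting as an identity, and the sum over any finite subset, including the empty one, becomes a plain fold (the names `extend` and `subset_sum` are hypothetical):

```python
from functools import reduce

STAR = object()  # the adjoined identity element ⋆: a fresh sentinel, not in S

def extend(op):
    """Extend op : S × S → S to S ∪ {STAR}, making STAR an identity.
    If op is commutative and associative, so is the extension."""
    def ext(x, y):
        if x is STAR:
            return y
        if y is STAR:
            return x
        return op(x, y)
    return ext

def subset_sum(A, op):
    """Sum of a finite subset A of S; returns STAR when A is empty."""
    return reduce(extend(op), A, STAR)

print(subset_sum({1, 2, 3}, lambda x, y: x + y))       # 6
print(subset_sum(set(), lambda x, y: x + y) is STAR)   # True
```

Note that `reduce` visits the set in an arbitrary order, which is exactly where commutativity and associativity are needed for the result to be well-defined.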