I want to prove that if $X_1,X_2$ are i.i.d. random variables then $E[X_1| X_1+X_2] = E[X_2|X_1+X_2]$.
I see that this is intuitive, but I think it is by no means trivial, yet everybody just states this property as though it were completely obvious. Am I really missing something obvious here?
In my attempts to prove this I haven't gotten far. It suffices to show $E[1_AX_1] = E[1_AX_2]$ for $A=\{X_1+X_2\in B\}$, where $B$ is an arbitrary Borel set. I don't see how this is trivial and would appreciate help seeing how it works.
The conditional expectation $E[X \mid Y]$ is defined by the property that for every bounded $\sigma(Y)$-measurable random variable $Z$, $$E[XZ] = E[ZE[X \mid Y]].$$
You can think of it as a "projection"[1]; it tells you that on the "space" generated by $Y$ (more precisely, the sigma-algebra $\sigma(Y)$), $X$ and $E[X \mid Y]$ act in the same way against $\sigma(Y)$-measurable random variables, just as the vectors $(1, 1, 0)$ and $(1,1,4)$ have the same inner product with every vector in the $xy$-plane of $\mathbb R^3$.
Now you know that $$E[X_1Z] = E[Z\, E[X_1 \mid X_1 + X_2]]$$
for every bounded $\sigma(X_1 + X_2)$-measurable $Z$. But the left-hand side is also equal to $E[X_2Z]$, since $X_1, X_2$ are i.i.d.[2] Hence
$$E[Z E[X_1 \mid X_1 + X_2]] = E[ZX_2]$$
Looking again at the definition, and using the fact that conditional expectation is unique up to almost-sure equality, this tells you that
$$E[X_1 \mid X_1 + X_2] = E[X_2 \mid X_1 + X_2]$$
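A consequence worth noting, assuming $X_1$ is integrable: writing $S = X_1 + X_2$, linearity of conditional expectation gives $$E[X_1 \mid S] + E[X_2 \mid S] = E[X_1 + X_2 \mid S] = S,$$ and combining this with the equality of the two conditional expectations yields $$E[X_1 \mid S] = \frac{S}{2} \quad \text{almost surely.}$$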
Another way to see this is through the symmetry of the problem: how are you going to distinguish $X_1$ from $X_2$? Swapping the names of $X_1$ and $X_2$ leaves everything unchanged, so the two conditional expectations must be equal.
[1]: Indeed, if we restrict ourselves to random variables in $L^2$, the inner product is precisely $(X,Y) = E[XY]$, so the definition is exactly analogous to the usual definition of projections in Euclidean space, or, for that matter, in any Hilbert space; and $L^2$ is indeed a Hilbert space. For random variables not in $L^2$, conditional expectation generalizes this projection.
[2]: It suffices to prove that the joint distributions of $(X_1, X_1 + X_2)$ and $(X_2, X_1 + X_2)$ are the same. As the OP @LeBtz pointed out in the comments, the correct argument is as follows: since $(X_1, X_2)$ and $(X_2, X_1)$ have the same distribution, $f(X_1, X_2)$ and $f(X_2, X_1)$ also have the same distribution for any measurable $f$. Applying this to $f(x,y) = (x, x+y)$ proves the result.
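As a numerical sanity check (not part of the proof), one can simulate i.i.d. pairs, approximate the conditioning on $S = X_1 + X_2$ by binning $S$, and compare the empirical conditional means of $X_1$ and $X_2$. The exponential distribution below is an arbitrary choice; any i.i.d. pair with finite mean would do.

```python
import numpy as np

# Simulate i.i.d. X1, X2 and condition on S = X1 + X2 by binning S.
# Distribution choice (exponential) is arbitrary; the identity
# E[X1 | S] = E[X2 | S] should hold for any i.i.d. pair.
rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.exponential(scale=1.0, size=n)
x2 = rng.exponential(scale=1.0, size=n)
s = x1 + x2

# 20 equal-count bins for S, via its empirical quantiles.
edges = np.quantile(s, np.linspace(0, 1, 21))
idx = np.clip(np.digitize(s, edges) - 1, 0, 19)

# Empirical conditional means E[X1 | S in bin], E[X2 | S in bin], E[S | S in bin].
m1 = np.array([x1[idx == k].mean() for k in range(20)])
m2 = np.array([x2[idx == k].mean() for k in range(20)])
ms = np.array([s[idx == k].mean() for k in range(20)])

# Within each bin, m1 and m2 should agree up to sampling error,
# and each should be close to (mean of S in that bin) / 2.
print("max |E[X1|bin] - E[X2|bin]|:", np.max(np.abs(m1 - m2)))
print("max |E[X1|bin] - E[S|bin]/2|:", np.max(np.abs(m1 - ms / 2)))
```

Note that within each bin the two empirical means sum exactly to the bin mean of $S$, so the two printed discrepancies differ by a factor of two; both shrink as the sample size grows.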