This is a very basic fact that I am struggling to find a proof for:
Let $X_{1},X_{2},..,X_{n}$ be i.i.d and let $S_{n}=\sum_{i=1}^{n}X_{i}$.
Then $E[X_{1}|S_{n}]=E[X_{2}|S_{n}]$.
The proof I have found takes an arbitrary $A \in \mathcal{B}(\mathbb{R})$ and then argues that $$E[X_{1}\mathbb{1}_{\{S_{n}\in A\}}]=E[X_{2}\mathbb{1}_{\{S_{n}\in A\}}]$$
Why is that? Independence and identical distribution alone can't be enough: take $\Omega=\{1,2,3,4\}$ with the uniform measure, $X_{1}=\mathbb{1}_{\{1,2\}}$ and $X_{2}=\mathbb{1}_{\{1,3\}}$. They are identically distributed and independent, but for $Y=X_{1}$ we have $$E[X_{1}Y]=P(\{1,2\}) = \frac{1}{2} \neq \frac{1}{4} = P(\{1\}) = E[X_{2}Y].$$ Of course, $Y \notin \sigma(X_{1}+X_{2})$. But that doesn't help me with a precise proof...
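For concreteness, here is a quick sanity check of the counterexample by direct enumeration (assuming, as above, the uniform measure on $\Omega=\{1,2,3,4\}$), using exact rational arithmetic:

```python
from fractions import Fraction

# Uniform probability measure on Omega = {1, 2, 3, 4}
omega = [1, 2, 3, 4]
p = Fraction(1, 4)

X1 = lambda w: 1 if w in {1, 2} else 0  # indicator of {1, 2}
X2 = lambda w: 1 if w in {1, 3} else 0  # indicator of {1, 3}
Y = X1  # Y = X_1, which is NOT sigma(X_1 + X_2)-measurable

E_X1Y = sum(p * X1(w) * Y(w) for w in omega)  # = P({1,2}) = 1/2
E_X2Y = sum(p * X2(w) * Y(w) for w in omega)  # = P({1})   = 1/4

print(E_X1Y, E_X2Y)  # 1/2 1/4
```

One can also check directly that $X_1$ and $X_2$ are independent here: $P(X_1=1, X_2=1) = P(\{1\}) = \tfrac14 = P(X_1=1)\,P(X_2=1)$.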
I'll do the case where $n = 2$. The key is the symmetry between $X_1$ and $X_2$. But how do we construct a rigorous proof out of this idea?
Let's verify $\mathbb E[X_1 | S] = \mathbb E[X_2 | S]$ using the definition of conditional expectation. We need to prove that for every $\sigma(S)$-measurable set $H$, $$ \int_H X_1(\omega) \ d\mathbb P(\omega) = \int_H X_2(\omega) \ d\mathbb P(\omega).$$
Since $H \in \sigma(S)$, and $\sigma(S)$ consists by definition of the preimages $S^{-1}(A)$, it must be of the form $$ H = \{ \omega \in \Omega : X_1(\omega) + X_2(\omega) \in A\}$$ for some Borel set $A \in \mathcal B(\mathbb R)$.
Let's rewrite this statement in terms of the distribution measures induced by $X_1$ and $X_2$ on $\mathcal B(\mathbb R)$, $$ \mu_{X_1}(A) := \mathbb P(\omega \in\Omega : X_1(\omega) \in A ) \\ \mu_{X_2}(A) := \mathbb P(\omega \in\Omega : X_2(\omega) \in A ) $$ and in terms of the distribution measure induced jointly by $X_1$ and $X_2$ on $\mathcal B(\mathbb R^2)$, $$ \mu_{(X_1, X_2)}(B) := \mathbb P(\omega \in\Omega : (X_1(\omega), X_2(\omega)) \in B). $$
The statement we want to prove is that if $$ B = \{ (x_1, x_2) \in \mathbb R^2 : x_1 + x_2 \in A \}$$ for some $A \in \mathcal B(\mathbb R)$, then $$ \int_B x_1 \ d\mu_{(X_1, X_2)} = \int_B x_2 \ d\mu_{(X_1, X_2)}.$$ (This is just the earlier identity rewritten via the change-of-variables formula $\int_{S^{-1}(A)} X_i \, d\mathbb P = \int_B x_i \, d\mu_{(X_1,X_2)}$.)
Now note that, by independence, $\mu_{(X_1, X_2)} = \mu_{X_1} \times \mu_{X_2}$, and since $X_1$ and $X_2$ are identically distributed, $\mu_{X_1} = \mu_{X_2}$.
So using $\mu$ to denote both $\mu_{X_1}$ and $\mu_{X_2}$ (which are equal), our task is to show that $$ \int_B x_1 \ d(\mu \times \mu) = \int_B x_2 \ d(\mu \times \mu).$$
But by Fubini's theorem, $$ \int_B x_1 \ d(\mu \times \mu) = \int_{\mathbb R} \left( \int_{A - x_2} x_1 d\mu(x_1) \right) d\mu(x_2),$$ $$ \int_B x_2 \ d(\mu \times \mu) = \int_{\mathbb R} \left( \int_{A - x_1} x_2 d\mu(x_2) \right) d\mu(x_1),$$ where $$A - c := \{ x \in \mathbb R : x + c \in A \}.$$ The two expressions on the right-hand sides are identical (since $x_1$ and $x_2$ are merely dummy variables), so we're done.
[By the way, in order to use Fubini's theorem legitimately, I assumed that $X_1$ and $X_2$ had finite expectation - which is usually assumed when we talk about conditional expectations.]