Let $(\Omega, \mathcal{F}, P)$ be a probability space, and let $X_1,X_2,\dots$ be a sequence of i.i.d. random variables. Let $$S_n:=X_1+\cdots+ X_n,$$ and define $\mathcal{G}:=\sigma(S_n)$, the $\sigma$-field generated by the sum $S_n$. Compute $E(X_1\vert S_n)$.
For reference, by definition, $E(X_1\vert S_n):= E(X_1\vert\sigma(S_n))$, where the latter is the a.s. unique $\sigma(S_n)$-measurable random variable such that $$\int_A E(X_1\vert\sigma(S_n))\,dP=\int_A X_1\,dP\quad\text{for all }A\in\sigma(S_n).$$
My question is whether the independence of the $X_i$ is actually needed. It seems unnecessary, because my first attempt only uses the fact that they are identically distributed:
$$S_n = E(S_n\vert S_n) = E(X_1\vert S_n) +\cdots + E(X_n\vert S_n) =: Z_1+\cdots+ Z_n.$$
The first equality is immediate from the definition, and the second follows from linearity of conditional expectation. Then for any $1\le i< j\le n$ and any $A\in\sigma(S_n)$, the variables $Z_i$ and $Z_j$ are $\sigma(S_n)$-measurable and
$$\int_A Z_i\,dP = \int_A X_i\,dP = \int_A X_j\,dP = \int_A Z_j\,dP,$$
since the $X_i$ are identically distributed and $\sigma(S_n)\subset \mathcal{F}$. Thus, by the a.s. uniqueness in the definition, $Z_i=Z_j$ a.s., so $E(X_1\vert S_n) = E(X_i\vert S_n)$ for all $i=1,\dots, n$. Hence
$$S_n = n\cdot E(X_1\vert S_n) \implies E(X_1\vert S_n) =\frac{1}{n} S_n.$$
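As a quick numerical sanity check of the i.i.d. case (my own addition, not part of the original question): for i.i.d. Bernoulli($p$) variables, $E(X_1\mid S_n=k)$ can be computed exactly by enumerating all $2^n$ outcomes, and it should come out to $k/n$, matching $E(X_1\vert S_n)=S_n/n$.

```python
from itertools import product

# E(X1 | Sn = k) by exact enumeration for i.i.d. Bernoulli(p) variables.
p, n = 0.3, 3
cond_num = {}  # sum of x1 * P(outcome), grouped by the value of Sn
cond_den = {}  # P(Sn = k)
for xs in product([0, 1], repeat=n):
    prob = 1.0
    for x in xs:
        prob *= p if x == 1 else 1 - p
    k = sum(xs)
    cond_num[k] = cond_num.get(k, 0.0) + xs[0] * prob
    cond_den[k] = cond_den.get(k, 0.0) + prob

for k in range(n + 1):
    print(f"E(X1 | Sn={k}) = {cond_num[k] / cond_den[k]:.6f}, k/n = {k / n:.6f}")
```

The printed conditional expectations agree with $k/n$ for every $k$, as the derivation predicts.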
Of course, most of the equalities above are meant in the almost sure sense. It doesn't seem like independence is necessary, but I want to verify.
I might be late, but you do need more than identical distributions here. You are integrating $X_i$ over an event of the form $A=\{S_n\in B\}$, which involves all of $X_1,\dots,X_n$. To exchange $X_i$ and $X_j$ in the integral, you need, for example, that the vector $(X_1,\dots,X_i,\dots,X_j,\dots,X_n)$ has the same distribution as $(X_1,\dots,X_j,\dots,X_i,\dots,X_n)$ (together with the fact that the swap does not change the event $A$, but that is trivial). This holds for i.i.d. random variables (more generally, for exchangeable ones), but not for merely identically distributed ones. As a counterexample, take $n=2$ with $X_1$ uniform on $\{0,1,2\}$ and $X_2=(X_1+1)\bmod 3$: both are uniform on $\{0,1,2\}$, but $x\mapsto x+\big((x+1)\bmod 3\big)$ is injective, so $\sigma(S_2)=\sigma(X_1)$ and $E(X_1\vert S_2)=X_1\neq S_2/2$.
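Here is a short exact check, using the identically distributed but non-exchangeable pair $X_1$ uniform on $\{0,1,2\}$, $X_2=(X_1+1)\bmod 3$ (my own choice of illustrating example): enumerating the three outcomes shows that $E(X_1\mid S_2)$ disagrees with $S_2/2$ at every value of the sum.

```python
from collections import defaultdict
from fractions import Fraction

# X1 uniform on {0,1,2}; X2 = (X1+1) mod 3 is also uniform on {0,1,2},
# so the pair is identically distributed -- but not exchangeable.
outcomes = [(x1, (x1 + 1) % 3) for x1 in range(3)]  # each outcome has prob 1/3

# E(X1 | S2 = s) by enumeration: numerator sums x1 * P, denominator sums P.
num = defaultdict(Fraction)
den = defaultdict(Fraction)
for x1, x2 in outcomes:
    s = x1 + x2
    num[s] += Fraction(x1, 3)
    den[s] += Fraction(1, 3)

for s in sorted(den):
    print(f"E(X1 | S2={s}) = {num[s] / den[s]},  S2/2 = {Fraction(s, 2)}")
```

The map $x_1 \mapsto s$ is injective here, so conditioning on $S_2$ pins down $X_1$ exactly, and the conditional expectation never equals $S_2/2$.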