Let $(X_i)_{i\geq1}$ be i.i.d. random variables in $\mathcal{L}^1(\Omega,\mathcal{F},P)$. Is it true that
$$E\Big(X_j \,\Big|\, \sum_{i=1}^n X_i\Big)=\frac{1}{n}\sum_{i=1}^n X_i$$
for each $j$ with $1\leq j \leq n$?
I think it is true, because given only the information in the sum, the best forecast for $X_j$ is the average.
I wonder whether this can be proven formally using the defining relation of conditional expectation, i.e. $E(E(X\mid\mathcal{G})\,I_A)=E(X\,I_A)$ for every $A\in\mathcal{G}$, where $\mathcal{G}$ is a sub-$\sigma$-algebra of $\mathcal{F}$.
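As a sanity check (not a proof), the defining relation can be tested numerically for the candidate $E(X_j\mid S_n)=S_n/n$: pick a set $A=\{S_n\le c\}\in\sigma(S_n)$ and compare $E\big(\tfrac{S_n}{n}I_A\big)$ with $E(X_1 I_A)$. The choice of distribution (Exp(1)), $n$, and $c$ below are arbitrary illustrative assumptions:

```python
# Monte Carlo check of E(E(X|G) I_A) = E(X I_A) with X = X_1,
# G = sigma(S_n), candidate E(X_1 | S_n) = S_n / n, and A = {S_n <= c}.
# Exp(1) variables, n, and c are arbitrary choices for illustration.
import random

random.seed(0)
n, c, N = 3, 2.0, 1_000_000

lhs = rhs = 0.0
for _ in range(N):
    xs = [random.expovariate(1.0) for _ in range(n)]
    s = sum(xs)
    if s <= c:           # indicator of A = {S_n <= c}
        lhs += s / n     # contribution of the candidate E(X_1|S_n) * I_A
        rhs += xs[0]     # contribution of X_1 * I_A
lhs /= N
rhs /= N
print(lhs, rhs)  # the two averages agree up to Monte Carlo error
```

Of course this only probes one distribution and one set $A$; the answer below gives the general argument.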
Let $S_n = \sum\limits_{i=1}^n X_i$.
Since $X_1,\dots,X_n$ are i.i.d., the pair $(X_j, S_n)$ has the same joint distribution for every $j$ with $1\leq j\leq n$, so the conditional expectations $\mathsf E(X_j\mid S_n)$ all coincide almost surely. It is a matter of symmetry (exchangeability). $$\begin{align}\mathsf E(X_j\mid S_n) &= \tfrac 1n\sum_{i=1}^n\mathsf E(X_i\mid S_n) &&\text{symmetry, }\forall j\in\{1..n\}\\[1ex] & = \tfrac 1n\mathsf E\Big(\sum_{i=1}^n X_i\;\Big|\; S_n\Big) && \text{linearity of conditional expectation}\\[1ex] &= \tfrac 1n \mathsf E(S_n\mid S_n) && \text{by definition of } S_n\\[1ex] & = \tfrac 1n S_n &&S_n\text{ is }\sigma(S_n)\text{-measurable} \\[2ex]\therefore\quad\mathsf E\Big(X_j\;\Big|\; \sum_{i=1}^n X_i\Big) & = \tfrac 1n\sum_{i=1}^n X_i&&\text{when }{(X_j)}_{j\in\{1..n\}}\text{ are i.i.d.} \end{align}$$ That is all you need.
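For a finite sample space the identity can even be verified exactly. With i.i.d. Bernoulli($p$) variables, $\mathsf E(X_j\mid S_n=k)=k/n$ for every $j$ and $k$, which the sketch below confirms by enumerating all $2^n$ outcomes (the values of $n$ and $p$ are arbitrary illustrative choices):

```python
# Exact check of E(X_j | S_n) = S_n / n for i.i.d. Bernoulli(p) variables.
# Enumerate all 2**n outcomes, weight each by its probability, and compute
# E(X_j | S_n = k) = E(X_j * 1{S_n = k}) / P(S_n = k) for every j and k.
from itertools import product

n, p = 4, 0.3  # arbitrary illustrative choices

prob_s = {k: 0.0 for k in range(n + 1)}                       # P(S_n = k)
joint = {(j, k): 0.0 for j in range(n) for k in range(n + 1)} # E(X_j 1{S_n=k})

for x in product([0, 1], repeat=n):
    w = 1.0
    for xi in x:
        w *= p if xi else 1 - p   # probability of this outcome
    k = sum(x)
    prob_s[k] += w
    for j in range(n):
        joint[(j, k)] += x[j] * w

for k in range(n + 1):
    for j in range(n):
        cond = joint[(j, k)] / prob_s[k]   # E(X_j | S_n = k)
        assert abs(cond - k / n) < 1e-12, (j, k, cond)
print("E(X_j | S_n = k) = k/n verified for all j, k")
```

Note that $\mathsf E(X_j\mid S_n)$ does not depend on $p$, exactly as the symmetry argument predicts.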