I am struggling to understand the following computation:
Let $p \in [0,1]$ and let $X_1, X_2, \ldots$ be i.i.d. Bernoulli random variables with parameter $p$, so $P(X_i = 1) = p = 1 - P(X_i = 0)$. The question is what $E[X_1|S_n]$ is, where $S_n = \sum_{i=1}^{n} X_i$.
It is stated that the answer is $$E[X_1|S_n] = \sum_{k=0}^{n} P(X_1=1|S_n=k) 1_{\{S_n=k\}}$$
However, I don't manage to get there...
The definition we have is for a sigma-field $F$ and sets $A_i \in F$ forming a countable partition of $\Omega$. Then $$ E[X|F](\omega)=\sum_{i:P(A_i)>0} \frac{1}{P(A_i)} \cdot E[X \cdot 1_{A_i}] \cdot 1_{A_i}(\omega)$$
My attempted explanation was that the sigma-field $F$ is the one generated by the random variable $S_n$, so that $A_k = \{S_n = k\}$. Then most of the definition would make sense, but
(1) I don't understand why the term $\frac{1}{P(A_i)}$, which I thought equals $\frac{1}{P(S_n=k)} \stackrel{?}{=} \frac{1}{p^k}$, is gone... and
(2) I don't understand why the sum starts at $k=0$, since for $k=0$ we would have $P(S_n=0)=0$ if $p=1$.
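To convince myself about point (1), I tried to evaluate the coefficient $\frac{1}{P(A_k)} \cdot E[X_1 \cdot 1_{A_k}]$ from the definition numerically by brute force (a minimal sketch of my own; the values $n = 4$ and $p = 0.3$ are arbitrary, and the script simply enumerates all $2^n$ outcomes):

```python
# Exact enumeration of all 2^n outcomes of (X_1, ..., X_n) to compute, for
# each k, the coefficient (1 / P(S_n = k)) * E[X_1 * 1_{S_n = k}] that
# appears in the partition definition of E[X | F].
from itertools import product

n, p = 4, 0.3        # arbitrary small example values
coeff = {}           # k -> coefficient in front of 1_{S_n = k}

for k in range(n + 1):
    p_Sk = 0.0       # accumulates P(S_n = k)
    e_X1 = 0.0       # accumulates E[X_1 * 1_{S_n = k}]
    for omega in product([0, 1], repeat=n):
        if sum(omega) != k:
            continue
        # each specific 0/1 string with k ones has probability p^k (1-p)^(n-k)
        prob = p ** k * (1 - p) ** (n - k)
        p_Sk += prob
        e_X1 += omega[0] * prob
    # since X_1 is 0/1-valued, E[X_1 * 1_{S_n=k}] = P(X_1 = 1, S_n = k),
    # so this ratio is exactly P(X_1 = 1 | S_n = k)
    coeff[k] = e_X1 / p_Sk
    print(k, coeff[k])
```

At least numerically, the $\frac{1}{P(A_k)}$ factor does not vanish: since $X_1$ only takes the values $0$ and $1$, $E[X_1 \cdot 1_{\{S_n=k\}}] = P(X_1=1, S_n=k)$, and dividing by $P(S_n=k)$ gives exactly $P(X_1=1|S_n=k)$.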
Thanks a million in advance for your help! :-)
Remark: I know that there is a similar question (Conditional expectation for random walks), but that one does not really help me...