I've done a quick search but can't seem to find a satisfying explanation for the following:
Let $X_1,\dots,X_n$ be $\{-1,1\}$-valued random variables that are not necessarily independent, with $E[X_i]=0$. Then: $$E[e^{\sum_i X_i}]= \prod_i E[e^{X_i}\mid \{ X_j: j < i \}].$$
Can I have a derivation and/or the intuitive idea behind this?
=======
Edit: This is in the context of a discussion of martingales; please see the screenshot below. Also, if my question could be phrased better, please help me do so.

You can derive this using the Law of Total Expectation, which states that $E[X] = E[E[X \mid Y]]$. One preliminary observation is needed first. In the martingale context, $E[X_i \mid X_1,\dots,X_{i-1}] = 0$, and since $X_i$ only takes the values $\pm 1$, this forces $P(X_i = 1 \mid X_1,\dots,X_{i-1}) = \tfrac12$. Hence $$E[e^{X_i} \mid X_1,\dots,X_{i-1}] = \tfrac12 e + \tfrac12 e^{-1} = \cosh(1),$$ a deterministic constant rather than a random variable. This constancy is exactly what lets each factor be pulled outside the remaining expectations; without it, the right-hand side of the identity would be a random variable and the identity would not even make sense as stated.

Now condition on the history and pull out what is known: $$E[e^{X_1+\dots+X_n}] = E\big[E[e^{X_1+\dots+X_n}\mid X_1,\dots,X_{n-1}]\big] = E\big[e^{X_1+\dots+X_{n-1}}\,E[e^{X_n} \mid X_1,\dots,X_{n-1}]\big]$$
$$ = E[e^{X_n} \mid X_1,\dots,X_{n-1}]\cdot E\big[e^{X_1+\dots+X_{n-1}}\big],$$ where the conditional factor came out of the expectation because it is a constant. Applying the same two steps to $E[e^{X_1+\dots+X_{n-1}}]$, and so on down to $E[e^{X_1}]$ (which equals $\cosh(1)$ as well, since $E[X_1]=0$ and $X_1 \in \{-1,1\}$), peels off one factor at a time: $$E[e^{X_1+\dots+X_n}] = \prod_i E[e^{X_i}\mid \{X_j : j<i\}],$$
which proves the identity.
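If it helps to see this numerically: under the martingale hypothesis, each $X_i$ is conditionally uniform on $\{-1,1\}$ regardless of the past, so the $X_i$ are in fact i.i.d. fair signs and both sides of the identity equal $\cosh(1)^n$. A small sketch (my own check, not from the screenshot) verifies this by exact enumeration:

```python
import itertools
import math

def lhs(n):
    """E[e^{X_1+...+X_n}] by exhaustive enumeration over all 2^n
    equally likely sign sequences (each has probability 2^-n)."""
    total = 0.0
    for signs in itertools.product((-1, 1), repeat=n):
        total += math.exp(sum(signs)) * (0.5 ** n)
    return total

def rhs(n):
    """prod_i E[e^{X_i} | X_1,...,X_{i-1}]; each factor is the
    constant cosh(1), so the product is cosh(1)^n."""
    return math.cosh(1) ** n

for n in range(1, 8):
    assert abs(lhs(n) - rhs(n)) < 1e-9
```

The agreement for every $n$ reflects the fact that the conditional factors are deterministic, which is the hinge of the derivation above.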