I have the following lemma in one of my probabilistic courses.
Let $(X_{i})_{i}$ be a sequence of i.i.d. random variables such that $\mathbb{P}(X=1)=\mathbb{P}(X=-1)=\frac{1}{2}$, and let $\Omega_{n}=\sigma(X_{1},X_{2},\dots, X_{n})$.
For every $n \geq 0$ and $A_{n} \in \Omega_{n}$, $\mathbb{E} \left [X_{n+1} \mathbf{1}_{A_{n}} \right ]=0$. Explain.
I'm not sure what this means, but here is what I think it means, with the help of an example: let each $X_i$ be a fair coin toss (heads $=1$, tails $=-1$). After $n$ tosses, $A_{n}$ is an event determined by what happened in those $n$ tosses, but since $X_{n+1}$ is independent of the first $n$ tosses, no matter what happened up until that point we still have $\mathbb{E} \left [X_{n+1} \right ] =0$. Am I right?
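To check my intuition numerically, here is a small simulation sketch (the particular event $A_n$ below, "the first $n$ tosses sum to a nonnegative value", is my own arbitrary choice of an $\Omega_n$-measurable event):

```python
import random

random.seed(0)

def trial(n=5):
    # n+1 independent fair +/-1 coin tosses
    xs = [random.choice([-1, 1]) for _ in range(n + 1)]
    # Indicator of A_n, an event decided by the first n tosses only
    # (here: their sum is nonnegative -- my arbitrary choice)
    indicator = 1 if sum(xs[:n]) >= 0 else 0
    # The random variable whose expectation the lemma says is 0
    return xs[n] * indicator

samples = [trial() for _ in range(200_000)]
est = sum(samples) / len(samples)
print(est)  # empirical mean; should be near 0
```

If the lemma (and my reading of it) is right, the printed estimate should be close to $0$, since $X_{n+1}$ is independent of $\mathbf{1}_{A_n}$ and has mean $0$.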