Let $X_1, X_2, \ldots$ be i.i.d. random variables with mean $\mu$, and let
$$S_n = \sum_{i=1}^n X_i.$$
Let $\mathcal{F}_n$ denote the information contained in $X_1, \ldots, X_n$. Show that
$$E[S_n \mid \mathcal{F}_m] = S_m + (n-m)\mu, \qquad m < n.$$
Solution:
Let $m<n$. Then, by the linearity property of martingales,
$$E[aY_{1} + bY_2 \mid \mathcal{F}_k] = E[aY_1 \mid \mathcal{F}_k] + E[bY_2 \mid \mathcal{F}_k]$$
we have that $(i)$
$$E[S_n \mid \mathcal{F}_m] = E[X_1+\cdots+X_m \mid \mathcal{F}_m] + E[X_{m+1}+\cdots+X_n \mid \mathcal{F}_n]$$
Since $X_1 +\cdots+X_m$ is measurable with respect to $\mathcal{F}_m$, $$E[X_1+\cdots+X_m \mid \mathcal{F}_m] = X_1 +\cdots+ X_m = S_m$$
Since $X_{m+1} +\cdots+X_n$ is independent of $X_1 +\cdots+X_m$, $(ii)$ $$E[X_{m+1}+\cdots+X_n \mid \mathcal{F}_n] = E[X_{m+1}+\cdots+X_n] = (n-m)\mu$$
$(i)$ I don't understand how this follows from the linearity property. How can they use both $\mathcal{F}_m$ and $\mathcal{F}_n$ there?
$(ii)$ I don't understand how the independence of $X_{m+1} +\cdots+X_n$ from $X_1 +\cdots+X_m$ can lead to
$$E[X_{m+1}+\cdots+X_n \mid \mathcal{F}_n] = E[X_{m+1}+\cdots+X_n]$$
I mean, they condition on $\mathcal{F}_n$.
I think there are some mistakes in the calculations (or typographical mistakes confusing $n$ and $m$), although the final result is correct.
First, let me explain the conditioning $\mid \mathcal F_m$. Simply stated: $\mathcal{F}_m$ contains everything you need to know about $X_1$ to $X_m$. So, given $\mathcal{F}_m$ (or conditional on $\mathcal{F}_m$), the values $X_1, \ldots, X_m$ are known. Of course, if $m<n$ then $\mathcal{F}_n$ contains all the information in $\mathcal{F}_m$ and, additionally, all the information about $X_{m+1}$ up to $X_n$.
The correct evaluation of the statement to prove should be as follows: $$E[S_n \mid \mathcal{F}_m]=E[X_1+\cdots+X_{m} \mid \mathcal{F}_m]+E[X_{m+1}+\cdots+X_n \mid \mathcal{F}_m]$$ by linearity of conditional expectation (note that both terms condition on $\mathcal{F}_m$). From this it should be clear that
$$E[X_1+\cdots+X_m \mid \mathcal{F}_m]=X_1+\cdots+ X_m=S_m,$$ since if you know $\mathcal{F}_m$ then you know $X_1, \ldots, X_m$. Now, for the second term: given $\mathcal F_m$ we do not know the values of $X_{m+1},\ldots, X_n$. But because they are independent of what happened in the past up to time $m$, i.e. independent of $\mathcal F_m$, we can drop the conditioning and write $$E[X_{m+1}+\cdots+X_n \mid \mathcal{F}_m]=E[X_{m+1}+\cdots+ X_n]=(n-m)\mu.$$
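If it helps to see the identity numerically, here is a minimal Monte Carlo sketch (my own illustration, not part of the original problem). Conditioning on $\mathcal{F}_m$ is modeled by fixing one realisation of $X_1,\ldots,X_m$ and averaging $S_n$ over fresh independent draws of $X_{m+1},\ldots,X_n$; the distribution (Bernoulli with $\mu = 0.5$) and the values $m=3$, $n=10$ are arbitrary choices for the demo.

```python
import random

random.seed(0)
mu = 0.5      # mean of each X_i; here X_i ~ Bernoulli(0.5), an arbitrary choice
m, n = 3, 10

# One fixed realisation of X_1, ..., X_m: "given F_m" means these are constants.
observed = [1, 0, 1]
S_m = sum(observed)

# Estimate E[S_n | F_m]: the future terms X_{m+1}, ..., X_n are independent
# of F_m, so we average S_m + (X_{m+1} + ... + X_n) over fresh draws.
trials = 200_000
total = 0.0
for _ in range(trials):
    future = sum(1 for _ in range(n - m) if random.random() < 0.5)
    total += S_m + future
estimate = total / trials

print(estimate)               # should be close to S_m + (n - m) * mu = 5.5
print(S_m + (n - m) * mu)
```

The estimate matches $S_m + (n-m)\mu$ up to Monte Carlo noise, which is exactly the claim: the known past contributes $S_m$, and each of the $n-m$ unknown future terms contributes its unconditional mean $\mu$.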