I've started this problem by defining two sigma-algebras, $F=\{\emptyset, a, a^c, \Omega\}$ and $G=\{\emptyset, b, b^c, \Omega\}$. The question I have is: if I condition, that is, consider $E[1_a(w)\mid G]$, is $1_a(w)$ then independent of $G$? ($1$ denotes the indicator function.)
The problem:

Does anyone have a solution to this problem?
Re the problem:
Everything works smoothly, really... Let us check that $\Omega=\{a,b,c\}$ with $\mathcal F=\sigma(\{b\})$, $\mathcal G=\sigma(\{c\})$, and $X=\mathbf 1_{\{a\}}$ is a "concrete example" most of the time, that is, for most probability measures on $\Omega$.
To simplify the proof, introduce the (symmetrized) notations $$x=P(\{a\}),\qquad y=P(\{b\}),\qquad z=P(\{c\}),\qquad Y=\mathbf 1_{\{b\}},\qquad Z=\mathbf 1_{\{c\}},$$ and assume that $xyz\ne0$. Then $x+y+z=1$, $X+Y+Z=1$, and $$E(X\mid \mathcal F)=\frac{x}{1-y}(1-Y),\qquad E(Y\mid \mathcal G)=\frac{y}{1-z}(1-Z),$$ hence $$E(E(X\mid \mathcal F)\mid \mathcal G)=\frac{x}{1-y}\left(1-\frac{y}{1-z}(1-Z)\right)=\frac{x\,(x+yZ)}{(1-y)(1-z)}.$$ The RHS is invariant under the exchange $(Y,y)\leftrightarrow(Z,z)$ if and only if $yZ=zY$ almost surely, which never happens unless $yz=0$. Hence $E(E(X\mid \mathcal F)\mid \mathcal G)\ne E(E(X\mid \mathcal G)\mid \mathcal F)$ whenever $xyz\ne0$.
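If you want to sanity-check the computation numerically, here is a small sketch on $\Omega=\{a,b,c\}$. The weights $x=1/2$, $y=1/3$, $z=1/6$ are arbitrary sample values, and `cond_exp` is a helper written just for this illustration (conditioning on a two-atom sigma-algebra means averaging over each atom):

```python
from fractions import Fraction as Fr

# Probability space Omega = {a, b, c} with P({a}) = x, P({b}) = y, P({c}) = z.
x, y, z = Fr(1, 2), Fr(1, 3), Fr(1, 6)   # sample values, x + y + z = 1 and xyz != 0
omega = ["a", "b", "c"]
p = {"a": x, "b": y, "c": z}

def cond_exp(f, atom):
    """E[f | sigma({atom})]: average f over {atom}, and over its complement."""
    def g(w):
        block = [atom] if w == atom else [v for v in omega if v != atom]
        return sum(p[v] * f(v) for v in block) / sum(p[v] for v in block)
    return g

X = lambda w: 1 if w == "a" else 0            # X = indicator of {a}
# F = sigma({b}), G = sigma({c})
EXF_G = cond_exp(cond_exp(X, "b"), "c")       # E(E(X|F)|G)
EXG_F = cond_exp(cond_exp(X, "c"), "b")       # E(E(X|G)|F)

# Check the closed form E(E(X|F)|G) = x(x + yZ) / ((1-y)(1-z)) pointwise.
for w in omega:
    Z = 1 if w == "c" else 0
    assert EXF_G(w) == x * (x + y * Z) / ((1 - y) * (1 - z))

print([(w, EXF_G(w), EXG_F(w)) for w in omega])
```

With these weights the two iterated conditionals disagree at $b$ ($9/20$ versus $3/5$), matching the claim that equality fails as soon as all three weights are positive.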
Re the question:
Well... This is difficult to parse, to say the least.
First, if $w$ is a fixed point of $\Omega$, then $1_a(w)$ is a number, not a random variable, hence $E[1_a(w)\mid G]=1_a(w)$ trivially, and $1_a(w)$ is independent of $G$ trivially, so the answer would be obvious. But what you have in mind is probably rather the question:

> If one considers $E[1_a\mid G]$, is $1_a$ independent of $G$?
The trouble now is that the preliminary part "if one considers $E[1_a\mid G]$" is unrelated to the question part "is $1_a$ independent of $G$?". The random variable $1_a$ is independent of $G$ or not, whether one "considers" $E[1_a\mid G]$ or not. Thus, we seem to be left with the question:

> Is $1_a$ independent of $G$?
...Whose best answer is that it depends on the hypotheses one makes; there is no well-posed question here. (By contrast, the problem copied from the book, answered above, makes perfect sense.)