I am trying to understand the concept of independence of a random variable $X$ from a $\sigma$-algebra $\mathcal F$, and why it implies $E[X|\mathcal F]=E[X]$. I cannot understand the last step from a formal point of view. In fact, let $X$ be a random variable on $(\Omega,\mathcal G,\mathbb P)$ with $\mathcal G$ independent of $\mathcal F$. Using the definition of conditional expectation:
$E[E[X|\mathcal F]\mathbb{1}_F]=E[X\mathbb{1}_F]\,\forall F\in\mathcal F$ $\Rightarrow E[E[X|\mathcal F]\mathbb{1}_F]=E[X]\mathbb P[F]$.
If I could write $E[E[X|\mathcal F]\mathbb{1}_F]=E[E[X|\mathcal F]]\,\mathbb P[F]$ then everything would make sense to me, but I don't see how I can claim that the conditional expectation given a $\sigma$-algebra is independent of the indicator function of a set belonging to that same $\sigma$-algebra. Would you please tell me how you would deal with this?
To check that $E[X\mid \mathcal{F}]=E[X]$, we need only check that the constant $E[X]$ satisfies the definition of $E[X\mid \mathcal{F}]$. A constant is measurable with respect to any $\sigma$-algebra, so the measurability requirement is automatic; it remains to show that for all $F \in \mathcal{F}$, $$E[E[X]1_F] = E[X1_F].$$ Note that $E[X]$ is a constant, so $E[E[X]1_F]=E[X]E[1_F]$. Hence all we need to show is that $E[X]E[1_F]=E[X1_F]$. But this follows from the fact that $X$ is $\mathcal{G}$-measurable, $1_F$ is $\mathcal{F}$-measurable, and $\mathcal{F}$ and $\mathcal{G}$ are independent $\sigma$-algebras, so $X$ and $1_F$ are independent random variables.
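To make the key identity $E[X]\,E[1_F]=E[X1_F]$ concrete, here is a small sketch on a finite product space (my own toy example, not from the question): two independent dice, where $X$ depends only on the first coordinate and $F$ only on the second, so the two $\sigma$-algebras they generate are independent. Exact fractions avoid any floating-point doubt.

```python
from fractions import Fraction

# Product space Omega = {1,...,6} x {1,...,6} with the uniform measure.
# X depends only on the first coordinate (so X is G-measurable),
# F depends only on the second (so F lies in the independent sigma-algebra F).
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p = Fraction(1, len(omega))  # each outcome has probability 1/36

X = lambda w: w[0]                       # X(omega) = value of the first die
ind_F = lambda w: 1 if w[1] <= 2 else 0  # F = {second die is 1 or 2}

E_X    = sum(X(w) * p for w in omega)             # E[X]   = 7/2
P_F    = sum(ind_F(w) * p for w in omega)         # P[F]   = 1/3
E_X_1F = sum(X(w) * ind_F(w) * p for w in omega)  # E[X 1_F]

# Independence gives E[X 1_F] = E[X] * P[F], exactly:
assert E_X_1F == E_X * P_F
print(E_X, P_F, E_X_1F)
```

The assertion holds exactly here because $X$ and $1_F$ are functions of different independent coordinates; this is precisely the factorization used in the last step of the answer.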