In a probability book I'm reading (Jacod and Protter), it states:
Let $Y \in L^2(\Omega,\mathcal{A},P)$, and let $\mathscr{G}$ be a sub-$\sigma$-algebra of $\mathcal{A}$. Then the conditional expectation of $Y$ given $\mathscr{G}$ is the unique element $E[Y|\mathscr{G}]$ of $L^2(\Omega,\mathscr{G},P)$ such that $E[YZ]=E[E[Y|\mathscr{G}]Z]$ for all $Z \in L^2(\Omega,\mathscr{G},P)$.
In a HW problem, I'm trying to show, for bounded $X$ and $Y$, that $E[YE[X|\mathscr{G}]]=E[XE[Y|\mathscr{G}]]$.
With the above definition this is trivial, so I would like to try to prove it with the other definition of $E[Y|\mathscr{G}]$: the random variable that is $\mathscr{G}$-measurable and satisfies $\int_G E[Y|\mathscr{G}] \,dP = \int_G Y \,dP$ for every $G \in \mathscr{G}$.
So could I get help on this... OR help showing the two definitions are equivalent?
To show that the two definitions are equivalent, just note that $\int_G {\rm E}[Y\mid \mathscr G]\,\mathrm dP=\int_G Y\,\mathrm dP$ for all $G\in\mathscr{G}$ says exactly that ${\rm E}[YZ]={\rm E}\big[{\rm E}[Y\mid\mathscr{G}]\,Z\big]$ holds for all $Z=\mathbf{1}_G$, $G\in\mathscr{G}$. Now, a standard argument shows that this in fact holds for all $\mathscr{G}$-measurable $Z\in L^2(\Omega,\mathscr{G},P)$: first extend to simple functions by linearity, then to the general case by approximation.
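For completeness, here is a sketch of that extension, and of how the equivalence then settles the HW identity; the last step takes $Z={\rm E}[X\mid\mathscr G]$, which is bounded and $\mathscr{G}$-measurable when $X$ is bounded:

```latex
% Step 1: simple Z. For Z = \sum_{i=1}^n a_i \mathbf{1}_{G_i} with G_i \in \mathscr{G},
% linearity of the expectation and the defining property on each G_i give
\mathrm{E}[YZ]
  = \sum_{i=1}^n a_i \int_{G_i} Y \,\mathrm{d}P
  = \sum_{i=1}^n a_i \int_{G_i} \mathrm{E}[Y \mid \mathscr{G}] \,\mathrm{d}P
  = \mathrm{E}\big[\mathrm{E}[Y \mid \mathscr{G}]\, Z\big].

% Step 2: general Z \in L^2(\Omega,\mathscr{G},P). Write Z = Z^+ - Z^-, approximate
% Z^+ and Z^- from below by simple \mathscr{G}-measurable functions, and pass to
% the limit (monotone/dominated convergence).

% Step 3: the HW identity. Taking Z = \mathrm{E}[X \mid \mathscr{G}] yields
\mathrm{E}\big[Y\,\mathrm{E}[X \mid \mathscr{G}]\big]
  = \mathrm{E}\big[\mathrm{E}[Y \mid \mathscr{G}]\,\mathrm{E}[X \mid \mathscr{G}]\big]
  = \mathrm{E}\big[X\,\mathrm{E}[Y \mid \mathscr{G}]\big],
% where the last equality applies the same computation with X and Y swapped.
```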