Assume all relevant expectations exist and that $g$ and $h$ are measurable functions.
I am aware of the definition, which (if I state it correctly) says:

$E[X \mid Y]$ is a $\sigma(Y)$-measurable function such that $\int_A E[X \mid Y] \,\mathrm{d}P = \int_A X \,\mathrm{d}P$ for every $A \in \sigma(Y)$.
I am not sure how I can prove the equality directly from this definition; maybe I did not understand the concept correctly. Could someone help me with the proof, or with the definition in general?
P.S. Is it possible to make use of some theorem, such as the Monotone Convergence Theorem, to approximate the integral?
Your equation follows from two facts:

1. $g(Y)$ is $\sigma(Y)$-measurable whenever $g$ is Borel measurable (a composition of measurable functions is measurable).
2. ("Taking out what is known.") If $Z$ is $\sigma(Y)$-measurable and the relevant expectations exist, then $E[ZX \mid Y] = Z\, E[X \mid Y]$ a.s.

Let's assume that (1) is given, as it is not a fact about conditional expectations, and sketch a proof of (2). For concreteness we take $Z = Y$ itself; the argument for a general $\sigma(Y)$-measurable $Z$ is identical.
The main thing is to show that, for every $A \in \sigma(Y)$,

$$\int_A XY \,\mathrm{d}P = \int_A Y\, E[X \mid Y] \,\mathrm{d}P, \qquad (\star)$$

since $Y\, E[X \mid Y]$ is $\sigma(Y)$-measurable and conditional expectations are unique a.s., so $(\star)$ exhibits $Y\, E[X \mid Y]$ as a version of $E[XY \mid Y]$.
Reduce to the case $X, Y \geq 0$ and assume $X = \mathbf{1}_{F_{0}}$ for $F_{0} \in \mathcal{F}$. Pick nonnegative simple $\sigma(Y)$-measurable $Y_{n} = \sum_{i} c_{n,i}\, \mathbf{1}_{G_{n,i}} \nearrow Y$, with $G_{n,i} \in \sigma(Y)$. Then, for any $A \in \sigma(Y)$,

$$\int_A \mathbf{1}_{F_{0}}\, Y \,\mathrm{d}P = \lim_{n} \sum_{i} c_{n,i}\, P(F_{0} \cap A \cap G_{n,i}) = \lim_{n} \sum_{i} c_{n,i} \int_{A \cap G_{n,i}} E[\mathbf{1}_{F_{0}} \mid Y] \,\mathrm{d}P = \int_A Y\, E[\mathbf{1}_{F_{0}} \mid Y] \,\mathrm{d}P,$$

where the middle equality is the defining property of $E[\mathbf{1}_{F_{0}} \mid Y]$ applied to the sets $A \cap G_{n,i} \in \sigma(Y)$, and the outer equalities follow from monotone convergence.
So $(\star)$ holds when $X$ is an indicator function. We can then extend to simple $X$ using the linearity of conditional expectation, and then to general nonnegative $X$ using the monotone convergence theorem for conditional expectation. For this latter step, approximate $X$ by simple $X_{n} \nearrow X$. Then, using monotone convergence, the result for simple $X$, and then monotone convergence once more, we have

$$\int_A XY \,\mathrm{d}P = \lim_{n} \int_A X_{n} Y \,\mathrm{d}P = \lim_{n} \int_A Y\, E[X_{n} \mid Y] \,\mathrm{d}P = \int_A Y\, E[X \mid Y] \,\mathrm{d}P \quad \text{for all } A \in \sigma(Y),$$

where the last equality uses that $E[X_{n} \mid Y] \nearrow E[X \mid Y]$ a.s.
And for general $X$ and $Y$, we can write $X = X^{+} - X^{-}$ and $Y = Y^{+} - Y^{-}$ and apply $(\star)$ to each of the four nonnegative pairs.
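As a sanity check (not part of the proof), the definition and $(\star)$ can be verified numerically on a small finite probability space, where $E[X \mid Y]$ is just the $P$-weighted average of $X$ over each level set of $Y$. Everything below — the weights, the choices of $X$, $Y$, and the test functions $g$ — is made up purely for illustration:

```python
from itertools import combinations

# A small finite probability space (all weights/functions are illustrative).
omega = range(6)
P = {0: 0.1, 1: 0.2, 2: 0.15, 3: 0.25, 4: 0.2, 5: 0.1}

X = {w: (w - 2) ** 2 for w in omega}   # some random variable X
Y = {w: w % 3 for w in omega}          # Y takes the values 0, 1, 2

# E[X|Y] from the definition: constant on each level set {Y = y},
# equal there to the P-weighted average of X over that set.
def cond_exp(X, Y):
    avg = {y: sum(X[w] * P[w] for w in omega if Y[w] == y) /
              sum(P[w] for w in omega if Y[w] == y)
           for y in set(Y.values())}
    return {w: avg[Y[w]] for w in omega}

EXY = cond_exp(X, Y)

# sigma(Y): all unions of level sets of Y (2^3 = 8 sets here).
levels = sorted(set(Y.values()))
sigma_Y = [{w for w in omega if Y[w] in S}
           for r in range(len(levels) + 1) for S in combinations(levels, r)]

def integral(f, A):  # \int_A f dP on a finite space
    return sum(f[w] * P[w] for w in A)

# g = 1 checks the defining identity itself; g = id checks (star) with Z = Y;
# the last g checks (star) for another sigma(Y)-measurable factor Z = g(Y).
for g in (lambda y: 1, lambda y: y, lambda y: 2 * y - 1):
    Z = {w: g(Y[w]) for w in omega}
    for A in sigma_Y:
        lhs = integral({w: Z[w] * X[w] for w in omega}, A)
        rhs = integral({w: Z[w] * EXY[w] for w in omega}, A)
        assert abs(lhs - rhs) < 1e-12
print("(*) verified on all", len(sigma_Y), "sets of sigma(Y)")
```

On a finite space the defining property determines $E[X \mid Y]$ exactly, so this kind of check is a quick way to convince yourself of the identity before working through the measure-theoretic proof.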