This is a follow-up to a question I had posted earlier.
Let $X$ and $Y$ be $\mathbb{R}^n$- and $\mathbb{R}^m$-valued random variables on the probability space $(\Omega, \mathcal{F}, P)$. Further assume that $X$ is $\mathcal{F}_1$-measurable and $Y$ is $\mathcal{F}_2$-measurable, where $\mathcal{F}_1$ and $\mathcal{F}_2$ are independent sub-$\sigma$-algebras of $\mathcal{F}$. Let $h : \mathbb{R}^n \times \mathbb{R}^m \rightarrow \mathbb{R}$ be $\mathcal{B}(\mathbb{R}^n) \otimes \mathcal{B}(\mathbb{R}^m)$-$\mathcal{B}(\mathbb{R})$-measurable and bounded. I want to show that
$$ E[h(X, Y) \mid \mathcal{F}_1](\omega) = E[h(x, Y)] \big|_{x = X(\omega)}. $$
With the help of the answer to my previous question I have managed to show this relation for $h = 1_D(x, y)$, $D \in \mathcal{B}(\mathbb{R}^n) \otimes \mathcal{B}(\mathbb{R}^m)$. By the definition of conditional expectation this means that $$ \int_{F} 1_D(X, Y) \, dP = \int_F E[1_D(x, Y)] \big|_{x = X} \, dP \quad \text{for any } F \in \mathcal{F}_1. $$
By linearity, this extends to all positive simple functions. I would now like to extend the result to positive bounded measurable functions and then to all bounded measurable functions.
Some thoughts:
Let $h(x, y)$ be positive, measurable and bounded.
Then there exists an increasing sequence of positive simple functions $h_n(x, y)$ converging pointwise to $h(x, y)$. Then also $h_n(X(\omega), Y(\omega)) \uparrow h(X(\omega), Y(\omega))$. Using the monotone convergence theorem we can write \begin{align} \int_{F} h(X, Y) \, dP &= \int_{F} \lim_{n \rightarrow \infty} h_n(X, Y) \, dP = \lim_{n \rightarrow \infty} \int_{F} h_n(X, Y) \, dP \\ &= \lim_{n \rightarrow \infty} \int_F E[h_n(x, Y)] \big|_{x = X} \, dP = \ldots \end{align}
To be able to apply the monotone convergence theorem one would need to show that $$ \tag{1} E[h_n (x, Y)] \big|_{x = X(\omega)} \uparrow E[h (x, Y)] \big|_{x = X(\omega)}, \quad \text{for every } \omega \in \Omega \text{ (pointwise)}, $$ probably using that $$ h_n (x, Y(\omega')) \big|_{x = X(\omega)} \uparrow h (x, Y(\omega')) \big|_{x = X(\omega)} \quad \text{for every } (\omega, \omega') \in \Omega \times \Omega \text{ (pointwise)}. $$
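As a quick numerical sketch of this monotone-convergence step (not part of the proof), one can take the standard dyadic approximations $h_n = \min(\lfloor 2^n h \rfloor / 2^n, n)$ of a positive bounded $h$ and check that the Monte Carlo estimates of $E[h_n(x_0, Y)]$ increase to that of $E[h(x_0, Y)]$. The particular $h$, the distribution of $Y$, and the point $x_0$ below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Positive bounded h and its standard dyadic simple approximations
# h_n = min(floor(2^n h) / 2^n, n), which increase pointwise to h.
h = lambda a, b: 1.0 / (1.0 + (a + b) ** 2)   # positive, bounded by 1

def h_n(n, a, b):
    return np.minimum(np.floor(2.0 ** n * h(a, b)) / 2.0 ** n, float(n))

x0 = 0.3                       # a fixed value x = X(omega)
y = rng.normal(size=500_000)   # samples of Y (any distribution works)

# E[h_n(x0, Y)] is nondecreasing in n and converges to E[h(x0, Y)],
# mirroring the claimed convergence in (1).
vals = [h_n(n, x0, y).mean() for n in range(1, 12)]
limit = h(x0, y).mean()
```

Since $0 \le h - h_n \le 2^{-n}$ once $n$ exceeds the bound on $h$, the gap between `vals[-1]` and `limit` is at most $2^{-11}$ here.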
Let $h(x, y)$ be measurable and bounded.
Write $h (X, Y) = h^{+} (X, Y) - h^{-} (X, Y)$. Then
\begin{align} \tag{2} \int_F h(X, Y) \, dP &= \int_F h^{+}(X, Y) \, dP - \int_F h^{-}(X, Y) \, dP \\ &= \int_F E[h^{+}(x, Y)] \big|_{x = X} \, dP - \int_{F} E[h^{-}(x, Y)] \big|_{x = X} \, dP \\ &= \int_F E[h^{+}(x, Y) - h^{-}(x, Y)] \big|_{x = X} \, dP \\ &= \int_F E[h(x, Y)] \big|_{x = X} \, dP. \end{align}
As long as $(2)$ is correct, how can one show $(1)$? I would also be interested in different ways of showing $(2)$, if there are any. A self-contained proof would be best; if a proof relies on known results, a reference would be nice. Thanks.
Your (1) is nothing but another application of the monotone convergence theorem. Fix $x$ for the moment. As you say, for each $\omega'$ we have $h_n(x, Y(\omega')) \uparrow h(x, Y(\omega'))$. Therefore by the monotone convergence theorem, we have $E[h_n(x, Y)] \uparrow E[h(x,Y)]$. Now $x$ was arbitrary, so this is true for every $x$; in particular it is true when $x = X(\omega)$, for any $\omega$.
Then (2) is just the linearity of the integral.
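As a numerical sanity check of the identity itself (not part of the proof), one can compare a brute-force estimate of $E[h(X, Y) \mid \mathcal{F}_1]$ near a point $X = x_0$ with the plug-in quantity $E[h(x, Y)] \big|_{x = x_0}$. Since $X$ is $\mathcal{F}_1$-measurable, the conditional expectation depends on $\omega$ only through $X(\omega)$, so it can be estimated by averaging over joint samples with $X$ near $x_0$. The choices of $h$ and the distributions below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000

# Independent real-valued X and Y, and a bounded measurable h.
x = rng.normal(size=n)
y = rng.uniform(-1.0, 1.0, size=n)
h = lambda a, b: np.tanh(a + b)   # bounded by 1

# Left side: E[h(X, Y) | F_1] depends on omega only through X(omega).
# Estimate it near X = x0 by averaging h over joint samples whose
# X falls in a small window around x0.
x0 = 0.5
window = np.abs(x - x0) < 0.01
lhs = h(x[window], y[window]).mean()

# Right side: E[h(x, Y)] evaluated at x = x0, using fresh Y samples.
rhs = h(x0, rng.uniform(-1.0, 1.0, size=n)).mean()

# The two estimates agree up to Monte Carlo and windowing error.
```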