This is (part of) question 8.8 from the lecture notes on measure theoretic probability theory by P. Spreij.
Let $X$ and $Y$ be random variables and assume that $(X,Y)$ admits a density $f$ w.r.t. Lebesgue measure on $(\mathbf{R}^2, \mathcal{B}(\mathbf{R}^2))$. Let $f_Y$ be the marginal density of $Y$. Define $\hat{f}(x|y)$ by \begin{equation} \hat{f}(x|y) = \begin{cases} \frac{f(x,y)}{f_Y(y)}&\text{ if $f_Y(y) >0$,}\\ 0 &\text{ otherwise.} \end{cases} \end{equation} Let $h:\mathbf{R} \to \mathbf{R}$ be $\mathcal{B}(\mathbf{R})$-measurable and assume that $\mathbf{E}|h(X)|<\infty$. Put $\hat{h}(y) = \int_\mathbf{R} h(x)\hat{f}(x|y)\,dx$. Show that $\hat{h}(Y)$ is a version of $\mathbf{E}[h(X)|Y]$.
So I need to show that $\hat{h}(Y)$ is $\sigma(Y)$-measurable and satisfies $\int_F \hat{h}(Y)\,d \mathbf{P}=\int_F h(X)\,d \mathbf{P}$ for all $F \in \sigma(Y)$. To show the latter, first take a set $F = Y^{-1}(A) \in \sigma(Y)$, with $A \in \mathcal{B}(\mathbf{R})$ such that $f_Y > 0$ everywhere on $A$. Then
\begin{align} \int_F\hat{h}(Y)\,d \mathbf{P}&=\int_A \left[ \int_\mathbf{R} h(x)\hat{f}(x|y)\, dx \right] f_Y(y)\, dy \\ &=\int_A \left[ \int_\mathbf{R} h(x)\hat{f}(x|y) f_Y(y)\, dx \right] dy \\ &=\int_A \left[ \int_\mathbf{R} h(x)f(x,y)\, dx \right] dy . \end{align} Now I am not sure how to continue from here. We can for instance apply Fubini--Tonelli and continue as \begin{align} \int_F\hat{h}(Y)\,d \mathbf{P}&=\int_\mathbf{R} \left[ \int_A h(x)f(x,y)\, dy \right] dx \\ &=\int_\mathbf{R} h(x) \left[ \int_A f(x,y)\, dy \right] dx . \end{align} I would like the inner integral here to reduce to something like $\int_A f(x,y)\, dy = 1_A f_X(x)$, and then the argument would be done, up to some small details about the set where $f_Y$ is allowed to be zero. That is, assuming I made no mistakes.
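As a quick numerical sanity check of the formula $\hat{h}(y) = \int h(x)\hat{f}(x|y)\,dx$ (not part of the proof), one can take $(X,Y)$ bivariate standard normal with correlation $\rho$, where the conditional mean $\mathbf{E}[X \mid Y=y] = \rho y$ is known in closed form, and compare it with a quadrature evaluation of $\hat{h}(y)$ for $h(x)=x$. The densities below are the standard bivariate-normal formulas; everything else is illustration only.

```python
import numpy as np

# Sanity check (illustration only): for (X, Y) bivariate standard normal
# with correlation rho, the known conditional mean is E[X | Y = y] = rho*y.
# We evaluate hat_h(y) = ∫ x * f(x, y) / f_Y(y) dx by the trapezoid rule
# and compare it to rho*y.

rho = 0.6
y = 1.3                               # any point with f_Y(y) > 0

x = np.linspace(-10.0, 10.0, 4001)    # grid for the x-integration

# joint density f(x, y) of the bivariate standard normal
f_xy = np.exp(-(x**2 - 2*rho*x*y + y**2) / (2*(1 - rho**2))) \
       / (2*np.pi*np.sqrt(1 - rho**2))

# marginal density f_Y(y): standard normal
f_y = np.exp(-y**2 / 2) / np.sqrt(2*np.pi)

# hat_h(y) = ∫ h(x) * f_hat(x|y) dx with h(x) = x, via the trapezoid rule
integrand = x * f_xy / f_y
hat_h = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x)))

print(hat_h, rho * y)                 # the two values should agree closely
```

The agreement is very close because the integrand is smooth and decays rapidly, so the trapezoid rule is essentially exact on this grid.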
But then I am still stuck on how to show that $\hat{h}(Y)$ is $\sigma(Y)$-measurable. Thanks in advance for any help!
Let us start with the measurability of $\hat{h}(Y)$. It is enough to prove that the map $\hat{h}: \mathbb{R} \to \mathbb{R}$ defined by $$ y \mapsto \int h(x)\hat{f}(x|y)\, dx $$ is measurable.
Indeed, any measurable function of $Y$ is $\sigma(Y)$-measurable. To see that the map $\hat{h}$ is measurable, observe that $\hat{f}(x|y)$ is measurable in $y$ for fixed $x$. Then, approximating the integral by finite sums (each of which is again measurable in $y$), you recover measurability in $y$, since a pointwise limit of measurable maps is still measurable. You will need the integrability of $h(X)$ to make sure that the integral exists for almost all $y$ and that the approximations converge.
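To make the finite-sum step explicit (a sketch; your notes may package this as a monotone class or standard machine argument): if $h = \sum_{i=1}^n c_i 1_{B_i}$ is simple, then $$\hat{h}(y) = \sum_{i=1}^n c_i \int_{B_i} \hat{f}(x|y)\, dx,$$ and each map $y \mapsto \int_{B_i} \hat{f}(x|y)\, dx$ is measurable by Tonelli's theorem, since $\hat{f}$ is jointly measurable. For general $h$, take simple $h_n \to h$ pointwise with $|h_n| \le |h|$; then $\hat{h}_n(y) \to \hat{h}(y)$ by dominated convergence at every $y$ with $\int |h(x)|\hat{f}(x|y)\, dx < \infty$, which holds for $f_Y$-almost every $y$.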
If you prefer a less hands-on argument, note that $h(x) \hat{f}(x|y) \in L^1(\mathbb{R}^2, dx\otimes f_Y(y)\,dy)$. This follows from the calculation you wrote down, by estimating:
$$\int |h(x) \hat{f}(x|y)| \ dx \otimes f_Y(y)\,dy = \int \int |h(x)|f(x,y) \ dx\, dy = \mathbb{E} |h(X)| < +\infty$$
by your assumption (with $\otimes$ I indicate the product measure). For the first equality note that $\hat{f}(x|y)f_Y(y) = f(x,y)$ when $f_Y(y)>0$, while for almost every $y$ with $f_Y(y)=0$ we have $f(x,y)=0$ for a.e. $x$, since $\int f(x,y)\,dx = f_Y(y)$.
Then Fubini's theorem guarantees that the inner one-dimensional integral is finite for almost every $y$ and measurable in $y$.
Now, let us pass to the conditional expectation property. By the argument above, $\hat{h}(Y) \in L^1$. Hence we can use Fubini, and in addition a theorem in your lecture notes guarantees that the conditional expectation exists. To see that the random variable $\hat{h}(Y)$ is a version of the conditional expectation, we simply need to verify:
$$\mathbb{E}[1_F \hat{h}(Y)] = \mathbb{E}[1_F h(X)], \ \forall F \in \sigma(Y).$$
Indeed, write $F = Y^{-1}(A)$ for some $A \in \mathcal{B}(\mathbb{R})$; then we rewrite the left-hand side as:
$$ \mathbb{E}[1_F \hat{h}(Y)] = \int 1_A(y)\hat{h}(y) f_Y(y)\, dy = \int \int 1_A(y) h(x) f(x,y)\ dx\, dy = \mathbb{E}[1_A(Y) h(X)] = \mathbb{E}[1_F h(X)], $$ following your calculations.
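The defining property can also be checked numerically in a concrete model (a Monte Carlo sketch, not part of the proof): assume $(X,Y)$ bivariate standard normal with correlation $\rho$, where $\hat{h}(y) = \mathbb{E}[X \mid Y=y] = \rho y$ in closed form, and take $h(x)=x$ and $F = \{Y \ge 0\}$.

```python
import numpy as np

# Monte Carlo illustration of E[1_F hat_h(Y)] = E[1_F h(X)] for
# F = {Y >= 0}, with (X, Y) bivariate standard normal, correlation rho.
# In this model hat_h(y) = E[X | Y = y] = rho*y in closed form.

rng = np.random.default_rng(0)
rho, n = 0.6, 1_000_000

# simulate (X, Y): X = rho*Y + sqrt(1 - rho^2)*Z with Y, Z independent N(0,1)
y = rng.standard_normal(n)
z = rng.standard_normal(n)
x = rho * y + np.sqrt(1 - rho**2) * z

indicator = (y >= 0)                  # 1_F with F = {Y >= 0}
lhs = np.mean(indicator * rho * y)    # E[1_F hat_h(Y)], hat_h(y) = rho*y
rhs = np.mean(indicator * x)          # E[1_F h(X)] with h(x) = x

print(lhs, rhs)                       # both approximate rho/sqrt(2*pi)
```

With a million samples the two averages agree up to Monte Carlo error of order $10^{-3}$, consistent with the identity $\mathbb{E}[1_F \hat{h}(Y)] = \mathbb{E}[1_F h(X)]$.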