On page 197 of Probability: Theory and Examples by Rick Durrett, there is an example of a regular conditional distribution:

Suppose $X$ and $Y$ have a joint density $f(x,y) > 0$. Let $$\mu(y, A) = \frac{\int_A f(x,y)\,dx}{\int f(x,y)\,dx}.$$ Show that $\mu(Y(\omega), A)$ is an r.c.d. for $X$ given $\sigma(Y)$.

Question: Why is this an r.c.d.?
Recall the definition of an r.c.d.: given $(\Omega, \mathcal{F}, P)$, a measurable map $X : (\Omega, \mathcal{F}) \to (S, \mathcal{S})$, and a $\sigma$-field $\mathcal G \subseteq \mathcal F$, a map $\mu : \Omega \times \mathcal S \to [0,1]$ is an r.c.d. for $X$ given $\mathcal G$ if (i) for each $A \in \mathcal S$, $\omega \mapsto \mu(\omega, A)$ is a version of $P(X \in A \mid \mathcal G)$, and (ii) for a.e. $\omega$, $A \mapsto \mu(\omega, A)$ is a probability measure on $(S, \mathcal{S})$.
In our case, $S = \mathbb{R}$ and $\mathcal S$ is the Borel $\sigma$-field on $\mathbb{R}$ (not $\sigma(X)$, which is a sub-$\sigma$-field of $\mathcal F$); $\mathcal G = \sigma(Y) \subseteq \mathcal F$.
For (ii), it suffices to check the definition of a probability measure: $\mu(Y(\omega), \cdot)$ is nonnegative, its denominator is positive since $f > 0$, it assigns mass $1$ to $\mathbb{R}$ (for $A = \mathbb{R}$ the numerator equals the denominator), and it is countably additive because the integral is.
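As a concrete sanity check of (ii) (my own addition, not from Durrett), take $f$ to be a standard bivariate normal density with correlation $\rho$, which is strictly positive, and verify normalization and finite additivity of $A \mapsto \mu(y, A)$ numerically:

```python
# Numerical check of (ii) for a concrete choice of density (my choice, not
# Durrett's): standard bivariate normal with correlation rho, so f > 0.
import math

RHO = 0.6

def f(x, y, rho=RHO):
    # joint density of a standard bivariate normal with correlation rho
    z = (x * x - 2 * rho * x * y + y * y) / (1 - rho * rho)
    return math.exp(-z / 2) / (2 * math.pi * math.sqrt(1 - rho * rho))

def integral_x(y, lo, hi, n=4000):
    # midpoint Riemann sum of x -> f(x, y) over [lo, hi]
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h, y) for i in range(n)) * h

def mu(y, lo, hi, trunc=10.0):
    # mu(y, A) for the interval A = [lo, hi]; R truncated to [-trunc, trunc],
    # which is harmless here because the Gaussian tails are negligible
    return integral_x(y, lo, hi) / integral_x(y, -trunc, trunc)

y = 0.8
total = mu(y, -10.0, 10.0)                     # mu(y, R) should be 1
additive = mu(y, -1.0, 0.0) + mu(y, 0.0, 2.0)  # disjoint union (up to a null set)
union = mu(y, -1.0, 2.0)
print(total, abs(additive - union))  # ~1.0 and ~0.0
```

The normalization is exact by construction (numerator and denominator coincide for $A = \mathbb{R}$), and additivity holds up to quadrature error.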
For (i), $P(X \in A \mid \sigma(Y))$ is $\mathbb{E}(I_{X^{-1}(A)} \mid \sigma(Y))$. For $B \in \sigma(Y)$ with $P(B) > 0$, $$\int_B I_{X^{-1}(A)}\,dP = P(X^{-1}(A) \cap B).$$ My attempt: $$\int_B \mu(Y(\omega), A)\,dP = \int_B \frac{\int_A f(x,Y(\omega))\,dx}{\int f(x,Y(\omega))\,dx}\,dP = \frac{P(X^{-1}(A) \cap B)}{P(B)} = \int_B I_{X^{-1}(A)}\,dP = \int_B \mathbb{E}(I_{X^{-1}(A)} \mid \sigma(Y))\,dP$$
The claimed equality $$\int_B \frac{\int_A f(x,Y(\omega))\,dx}{\int f(x,Y(\omega))\,dx}\,dP = \frac{P(X^{-1}(A) \cap B)}{P(B)} \qquad (*)$$ is not true. Writing $B = Y^{-1}(C)$, the right side is equal to $$\frac{\int_C \int_A f(x,y)\,dx\,dy}{\int_C \int f(x,y)\,dx\,dy},$$ but of course it is not in general true that $\int (u/v) = \int u \big/ \int v$.
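The failure of $\int (u/v) = \int u / \int v$ can be seen on the smallest possible example; here is a throwaway numerical check on a two-point probability space (my own illustration):

```python
# Two-point probability space {0, 1} with P({0}) = P({1}) = 1/2, showing
# that the integral of a ratio is not the ratio of the integrals.
u = [1.0, 3.0]   # values of u at the two points
v = [1.0, 2.0]   # values of v at the two points
p = [0.5, 0.5]   # probabilities

int_ratio = sum(pi * ui / vi for pi, ui, vi in zip(p, u, v))   # E[u/v]
ratio_int = (sum(pi * ui for pi, ui in zip(p, u))
             / sum(pi * vi for pi, vi in zip(p, v)))           # E[u]/E[v]

print(int_ratio, ratio_int)  # 1.25 vs 1.333...
```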
To prove the correct statement, first recall that $B \in \sigma(Y)$ iff $B = Y^{-1}(C)$ for some Borel set $C \subseteq \mathbb{R}$ (an easy exercise from the definitions). Also, by the definition of a joint density, for every bounded measurable $F : \mathbb{R} \to \mathbb{R}$ we have $E[F(Y)] = \int F(Y(\omega))\,dP = \iint_{\mathbb{R}^2} F(y)f(x,y)\,dx\,dy$.
So let $B = Y^{-1}(C) \in \sigma(Y)$ be arbitrary, and take $$F(y) = 1_C(y) \frac{\int_A f(\xi,y)\,d\xi}{\int f(\xi,y)\,d\xi}$$ so that the left side of (*) is $E[F(Y)]$. (I changed the dummy variable from $x$ to $\xi$ to avoid confusion below.) Then as we just claimed, and using Fubini's theorem, $$\require{cancel}\begin{align*} E[F(Y)] &= \iint_{\mathbb{R}^2} F(y) f(x,y)\,dx\,dy \\ &= \int_{\mathbb{R}} F(y) \left(\int_{\mathbb{R}} f(x,y)\,dx\right) \,dy \\ &= \int_{\mathbb{R}} 1_C(y) \frac{\int_A f(\xi,y)\,d\xi}{\cancel{\int f(\xi,y)\,d\xi}} \cancel{\left(\int_{\mathbb{R}} f(x,y)\,dx\right)} \,dy \\ &= \int_{\mathbb{R}} \int_{\mathbb{R}} 1_C(y) 1_A(\xi) f(\xi,y)\,d\xi\,dy \\ &= E[1_C(Y) 1_A(X)] = P(Y^{-1}(C) \cap X^{-1}(A)) = P(B \cap X^{-1}(A)). \end{align*}$$ Note the cancellation is legitimate because we assume $f > 0$ and hence $\int f(\xi, y)\,d\xi > 0$ for every $y$. Since $B \in \sigma(Y)$ was arbitrary, this verifies (i).
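To see the identity just proved, $\int_B \mu(Y(\omega), A)\,dP = P(B \cap X^{-1}(A))$, in action, here is a Monte Carlo sketch using a standard bivariate normal with correlation $\rho$ (my choice of example; for it, $\mu(y, A) = P(N(\rho y, 1-\rho^2) \in A)$ in closed form):

```python
# Monte Carlo check (my own sketch, assuming a standard bivariate normal with
# correlation RHO) of  ∫_B mu(Y, A) dP = P(B ∩ X^{-1}(A)).
import math
import random

random.seed(0)
RHO = 0.6
N = 200_000

def normal_cdf(x, mean, sd):
    # CDF of N(mean, sd^2) via the error function
    return 0.5 * (1 + math.erf((x - mean) / (sd * math.sqrt(2))))

def mu(y, a, b, rho=RHO):
    # closed form of mu(y, [a, b]) for this density: X | Y = y ~ N(rho*y, 1 - rho^2)
    sd = math.sqrt(1 - rho * rho)
    return normal_cdf(b, rho * y, sd) - normal_cdf(a, rho * y, sd)

a, b = -0.5, 1.0   # A = [a, b]
c, d = 0.0, 2.0    # B = Y^{-1}([c, d]), an event in sigma(Y)

lhs = rhs = 0.0
for _ in range(N):
    y = random.gauss(0, 1)
    x = RHO * y + math.sqrt(1 - RHO * RHO) * random.gauss(0, 1)
    if c <= y <= d:
        lhs += mu(y, a, b)       # contributes to  ∫_B mu(Y, A) dP
        rhs += (a <= x <= b)     # indicator of  B ∩ X^{-1}(A)

lhs /= N
rhs /= N
print(lhs, rhs)  # the two estimates should agree up to Monte Carlo error
```

The `lhs` accumulator is the conditional-expectation version of the `rhs` indicator, so the two estimates coincide up to sampling noise, exactly as the proof predicts.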