In probability theory there is a form of Fubini's theorem involving Markov kernels (regular conditional distributions) that does not require independence. Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space and let $X:\Omega \rightarrow \mathcal{X}$ and $Y:\Omega \rightarrow \mathcal{Y}$ be random variables. Then for a function $f:\mathcal{X}\times\mathcal{Y}\rightarrow \mathbb{R}$ that is either measurable and non-negative, or integrable, the theorem reads $$\int_{\mathcal{X}\times\mathcal{Y}}f(x,y)\,\mathbb{P}_{(X,Y)}(d(x,y)) = \int_{\mathcal{X}}\biggl(\int_{\mathcal{Y}}f(x, y)\, \mathbb{P}_{Y\mid X}(dy, x)\biggr)\mathbb{P}_X(dx).$$
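As a quick numerical sanity check, the identity above can be verified on a small discrete example (a minimal sketch; all variable names here are illustrative, and the joint distribution is random, hence dependent):

```python
import numpy as np

# Sanity check of the kernel form of Fubini's theorem on a discrete example.
rng = np.random.default_rng(0)

# A joint distribution P(X = i, Y = j) on a 3 x 4 grid (dependent by construction).
joint = rng.random((3, 4))
joint /= joint.sum()

# Marginal P_X and the Markov kernel P_{Y|X}(dy, x) as rows of a stochastic matrix.
p_x = joint.sum(axis=1)                   # P_X(x)
kernel_y_given_x = joint / p_x[:, None]   # P_{Y|X}(y | x), each row sums to 1

# An arbitrary integrand f(x, y).
f = rng.standard_normal((3, 4))

# Left-hand side: integrate f against the joint law.
lhs = np.sum(f * joint)

# Right-hand side: inner integral over y w.r.t. the kernel, outer over x w.r.t. P_X.
inner = np.sum(f * kernel_y_given_x, axis=1)   # x -> ∫ f(x, y) P_{Y|X}(dy, x)
rhs = np.sum(inner * p_x)

assert np.isclose(lhs, rhs)
```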
My question is whether, as in Fubini's original theorem, it is possible to change the order of integration, i.e., to integrate first over $y$ and then over $x$.
More specifically: is it possible to first view the conditional distribution, for a fixed set in $\mathcal{Y}$, as a measurable function on $\mathcal{X}$, and only after integrating over $x$ view the result as a measure on $\mathcal{Y}$?
I am not sure how to write this formally, but something like this: $$ \int_{\mathcal{Y}}\biggl(\int_{\mathcal{X}}f(x, y)\, \mathbb{P}_X(dx)\,\mathbb{P}_{Y\mid X}(dy, x)\biggr). $$ Is this possible, perhaps under certain conditions?
The equality $$\int_{\mathcal{X}}\biggl(\int_{\mathcal{Y}}f(x, y)\, \mathbb{P}_{Y\mid X=x}(dy)\biggr)\mathbb{P}_X(dx) = \int_{\mathcal{Y}}\biggl(\int_{\mathcal{X}}f(x, y)\, \mathbb{P}_X(dx)\biggr)\mathbb{P}_{Y\mid X=x}(dy)$$ does not make sense unless $X$ and $Y$ are independent: the right-hand side depends on $x$, while the left-hand side does not.
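A sketch of the order swap that does work, assuming the reverse kernel $\mathbb{P}_{X\mid Y}$ exists (e.g. when $\mathcal{X}$ is a standard Borel space): disintegrate the joint law in the other direction, so that the kernel conditions on the outer variable,

```latex
\int_{\mathcal{X}\times\mathcal{Y}} f(x,y)\,\mathbb{P}_{(X,Y)}(d(x,y))
  = \int_{\mathcal{Y}}\biggl(\int_{\mathcal{X}} f(x,y)\,
      \mathbb{P}_{X\mid Y}(dx, y)\biggr)\mathbb{P}_Y(dy).
```

Here no independence is needed, but the inner kernel must match the outer variable: integrating over $y$ on the outside pairs with $\mathbb{P}_{X\mid Y}$ on the inside, not with $\mathbb{P}_{Y\mid X}$.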