I am considering a Gibbs sampling algorithm where I want to start from a given pair of conditional distributions $f(y \mid x)$ and $f(x \mid y)$ and form a Markov chain by sampling $Y_t \sim f(y \mid X_t)$ and $X_{t+1} \sim f(x \mid Y_t)$.
I am willing to make the following assumptions to try to establish that the stationary distribution exists:
Both $f(y \mid x)$ and $f(x \mid y)$ are densities with respect to Lebesgue measure and are compactly supported.
For both densities we have $\epsilon \le f(y \mid x) \le \epsilon^{-1}$ and $\epsilon \le f(x \mid y) \le \epsilon^{-1}$.
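To make this concrete, here is a small simulation sketch of the two-step update, using a hypothetical pair of conditionals on $[0,1]$ that I made up purely for illustration (both are bounded between $1/2$ and $2$, so the assumption holds with $\epsilon = 1/2$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conditionals on [0, 1], chosen only for illustration; both are
# bounded between 1/2 and 2, so the assumption holds with eps = 1/2.
def f_y_given_x(y, x):
    return (1 + x * y) / (1 + x / 2)

def f_x_given_y(x, y):
    return (1 + (1 - x) * y) / (1 + y / 2)

def rejection_sample(density, upper=2.0):
    """Sample from a density on [0, 1] bounded above by `upper`."""
    while True:
        z = rng.uniform()
        if rng.uniform() * upper <= density(z):
            return z

def gibbs_chain(n_steps, x0=0.5):
    xs, ys, x = [x0], [], x0
    for _ in range(n_steps):
        y = rejection_sample(lambda v: f_y_given_x(v, x))   # Y_t ~ f(y | X_t)
        x = rejection_sample(lambda v: f_x_given_y(v, y))   # X_{t+1} ~ f(x | Y_t)
        ys.append(y)
        xs.append(x)
    return np.array(xs), np.array(ys)

xs, ys = gibbs_chain(2000)
```

The chain certainly runs and looks well behaved empirically; my question is about what its stationary distribution can be.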
I guess it should follow from this that the chain is irreducible, aperiodic, and recurrent, hence the stationary distribution should exist. But when I try to derive it, I seem to get into trouble, which makes me doubt the reasoning I gave. I tried to check directly by looking for $f(x,y)$ such that
$$ f(x', y') = \int f(x,y) \, f(x' \mid y) \, f(y' \mid x') \ dx \ dy $$
but this seems to suggest that $f(x,y)$ would necessarily have conditionals $f(x \mid y)$ and $f(y \mid x)$. If this were true, then from $f(x)\, f(y \mid x) = f(y)\, f(x \mid y)$ we would get $f(y \mid x)/f(x \mid y) = f(y)/f(x)$, and integrating this over $y$ would give $\int f(y \mid x)/f(x \mid y) \ dy = 1/f(x)$, i.e. an explicit formula for the joint:
$$ f(x,y) = \frac{f(y \mid x)}{\int \frac{f(y \mid x)}{f(x \mid y)} \ dy}. $$
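As a sanity check, this formula does recover the joint in a toy case where the conditionals are compatible by construction, e.g. starting from $f(x,y) = (1 + xy)/(5/4)$ on $[0,1]^2$ (my own example, with the $y$-integral approximated by a grid average):

```python
import numpy as np

# Toy compatible case: joint f(x,y) = (1 + x*y) / (5/4) on [0,1]^2,
# with its exact conditionals computed by hand.
n = 401
x = np.linspace(0.0, 1.0, n)
y = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, y, indexing="ij")   # rows index x, columns index y

joint = (1 + X * Y) / 1.25
f_y_given_x = (1 + X * Y) / (1 + X / 2)   # f(y | x)
f_x_given_y = (1 + X * Y) / (1 + Y / 2)   # f(x | y)

# Candidate joint: f(y|x) / integral_y [ f(y|x) / f(x|y) ] dy,
# approximating the integral by a simple grid average over [0, 1].
denom = (f_y_given_x / f_x_given_y).mean(axis=1)
candidate = f_y_given_x / denom[:, None]

err = np.max(np.abs(candidate - joint))
print(err)   # tiny: the formula reproduces the joint in the compatible case
```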
However, I can't see any reason in particular that we would have
$$ \frac{f(y \mid x)}{\int \frac{f(y \mid x)}{f(x \mid y)} \ dy} = \frac{f(x \mid y)}{\int \frac{f(x \mid y)}{f(y \mid x)} \ dx} $$
given only the assumptions listed above.
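Numerically the two sides do seem to differ for a hypothetical pair of conditionals that satisfy the boundedness assumption but, as far as I can tell, are not the conditionals of any common joint (again the integrals are approximated by grid averages):

```python
import numpy as np

# Hypothetical conditionals on [0,1], both bounded in [1/2, 2], but apparently
# not the conditionals of any single joint density.
n = 401
x = np.linspace(0.0, 1.0, n)
y = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, y, indexing="ij")   # rows index x, columns index y

f_y_given_x = (1 + X * Y) / (1 + X / 2)
f_x_given_y = (1 + (1 - X) * Y) / (1 + Y / 2)

# Left side:  f(y|x) / integral_y [ f(y|x) / f(x|y) ] dy  (average over y).
lhs = f_y_given_x / (f_y_given_x / f_x_given_y).mean(axis=1, keepdims=True)
# Right side: f(x|y) / integral_x [ f(x|y) / f(y|x) ] dx  (average over x).
rhs = f_x_given_y / (f_x_given_y / f_y_given_x).mean(axis=0, keepdims=True)

gap = np.max(np.abs(lhs - rhs))
print(gap)   # far larger than any plausible quadrature error
```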
So where am I going wrong? Am I not assuming enough to get a stationary distribution? And if the stationary distribution does exist, why does it seem to point me toward something that I don't think should work?