Given a sample $\left\{ X_i: i=\overline{1,n} \right\}$ with values in $\mathcal{X}$ and the conditional distributions $\mathbb{P}\left( X_i \mid X_t,\, t \neq i \right)$, I want to recover the joint distribution $\mathbb{P}\left( X \right)$ for all $X \in \mathcal{X}^n$.
The article *Compatible Conditional Distributions* gives necessary and sufficient conditions for a family of conditional probabilities to correspond to some joint probability. Are these conditions also sufficient for Gibbs sampling convergence, or only necessary? It seems that they do not guarantee the reachability of states from a given initial state.
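To make the reachability concern concrete, here is a toy illustration (my own example, not taken from the cited article): let $n = 2$, $\mathcal{X} = \{0, 1\}$, and take the joint $\mathbb{P}(0,0) = \mathbb{P}(1,1) = \tfrac{1}{2}$. The induced conditionals are

$$\mathbb{P}\left( X_1 = x \mid X_2 = y \right) = \mathbb{P}\left( X_2 = y \mid X_1 = x \right) = \mathbf{1}\left[ x = y \right],$$

which are perfectly compatible with this joint. Yet a Gibbs sampler started at $(0,0)$ always updates each coordinate to match the other, so it never reaches $(1,1)$: the chain is reducible, and compatibility alone does not yield convergence from a single initial state.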
In the article *Simple conditions for the convergence of the Gibbs sampler and Metropolis-Hastings algorithms*, a sufficient condition for Gibbs sampling convergence is given: for any initial state $X^0$ of positive probability and any final state $X^t$ of positive probability, there should exist a sequence of states $X^0, \dots, X^i, \dots, X^t$ such that $\mathbb{P}\left( X^i \mid X^{i-1} \right) > 0$ for all $i=\overline{1, t}$, for some finite $t$. In addition, the transitions should be aperiodic: there must not exist a partition $\bigcup\limits_k A_k = \mathcal{X}$ with $A_i \cap A_j = \emptyset$ for $i \neq j$ such that $X$ "jumps" between the sets $A_k$ in a fixed cyclic order again and again.
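When these conditions do hold, the joint can be estimated by simulation. Below is a minimal sketch in Python (the toy joint distribution, the systematic coordinate scan, and the step count are my own illustrative assumptions, not from the cited article): two binary variables whose conditionals are derived from a known joint, so that the empirical Gibbs frequencies can be checked against it. Since every state here has positive probability, the chain is irreducible and aperiodic.

```python
import random
from collections import Counter

# Hypothetical toy joint over two binary variables, used only so the
# Gibbs estimate can be verified; in practice only the conditionals
# P(X_i | X_t, t != i) would be available.
joint = {
    (0, 0): 0.3, (0, 1): 0.2,
    (1, 0): 0.1, (1, 1): 0.4,
}

def conditional(i, state):
    """P(X_i = 1 | the other coordinate), derived from the toy joint."""
    other = state[1 - i]
    if i == 0:
        p1, p0 = joint[(1, other)], joint[(0, other)]
    else:
        p1, p0 = joint[(other, 1)], joint[(other, 0)]
    return p1 / (p0 + p1)

def gibbs(n_steps, seed=0):
    """Systematic-scan Gibbs sampler; returns empirical state frequencies."""
    rng = random.Random(seed)
    state = [0, 0]
    counts = Counter()
    for _ in range(n_steps):
        for i in (0, 1):  # resample each coordinate from its conditional
            state[i] = 1 if rng.random() < conditional(i, state) else 0
        counts[tuple(state)] += 1
    return {s: c / n_steps for s, c in counts.items()}

est = gibbs(200_000)
# Because all four states communicate, est should be close to `joint`.
```

The same sketch applied to the reducible example (a joint supported only on $\{(0,0), (1,1)\}$) would stay at its initial state forever, which is exactly the failure mode the positivity condition rules out.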
Do there exist proven necessary and sufficient conditions under which the joint distribution can be estimated from the conditional ones?