The following problem is giving me a bit of a headache:
Let $X, Z$ be a pair of random variables. Under which conditions can I construct a random variable $Y$ such that $X \rightarrow Y \rightarrow Z$ is a Markov chain? To rule out trivial solutions such as $Y = X$, I additionally require the construction to be symmetric with respect to $X$ and $Z$: if I swap $X$ and $Z$ when defining $Y$, I obtain the same random variable.
I am thinking of $Y$ as some sort of variable that captures the "information" common to $X$ and $Z$. Maybe it even holds that $I(X;Y) = I(Y;Z)$?
So far I have not even been able to find such a $Y$ under the assumption that $X$ and $Z$ are correlated standard normal random variables (using the standard Brownian process did not work). Hence an answer under any additional assumptions is very welcome :)
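For what it's worth, in the correlated-Gaussian case with correlation $\rho \ge 0$ there does seem to be a symmetric candidate: draw a common component $Y \sim N(0, \rho)$ and set $X = Y + N_1$, $Z = Y + N_2$ with independent $N_1, N_2 \sim N(0, 1-\rho)$. Then $X \perp Z \mid Y$ by construction, the roles of $X$ and $Z$ are interchangeable, and the $(X, Z)$-marginal has unit variances and covariance $\operatorname{Var}(Y) = \rho$. A minimal simulation sketch (variable names are mine):

```python
import random

random.seed(0)

rho = 0.5    # assumed nonnegative correlation of the target (X, Z)
n = 200_000

xs, ys, zs = [], [], []
for _ in range(n):
    y = random.gauss(0.0, rho ** 0.5)             # common component, Var(Y) = rho
    x = y + random.gauss(0.0, (1 - rho) ** 0.5)   # X = Y + N1
    z = y + random.gauss(0.0, (1 - rho) ** 0.5)   # Z = Y + N2, independent of N1
    xs.append(x); ys.append(y); zs.append(z)

def cov(a, b):
    """Empirical covariance of two equal-length samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

print(cov(xs, xs), cov(zs, zs), cov(xs, zs))
```

The printed estimates should be close to $1$, $1$ and $\rho$ respectively. Note that this particular construction breaks down for $\rho < 0$, since it would need $\operatorname{Var}(Y) = \rho$.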
When I say that the construction of $Y$ should be symmetric, I mean the following. Let $\Omega$ be the space in which the variables $X, Y, Z$ take values, and denote by $\gamma_* f$ the pushforward of $\gamma$ through a function $f$. We are looking for a map $\pi$ that associates to a joint probability distribution $\gamma$ on $\Omega^2$ a probability distribution $\pi(\gamma)$ on $\Omega^3$. This map must satisfy the following requirements:
- (Marginal condition) For every joint probability distribution $\gamma$ it must hold that $\pi(\gamma)_* p_{1,3} = \gamma$, where $p_{1,3}(x, y, z) = (x, z)$.
- (Symmetry condition) $\pi(\gamma_* f) = \pi(\gamma)_* g$, where $f(x, z) = (z, x)$ and $g(x, y, z) = (z, y, x)$. That is, swapping the two inputs swaps the roles of $X$ and $Z$ but leaves $Y$ unchanged.
- (Markov condition) $(X, Y, Z) \sim \pi(\gamma)$ implies that $X \rightarrow Y \rightarrow Z$ is a Markov chain.
Edit: After discussing this question in the comments, I realized that such a $\pi$ is easy to construct if we use a decision rule that sets $Y=X$ or $Y=Z$ based on some symmetric condition on $X, Z$.