Entropy of intervening variable in Markov Chain


Let's assume we are given discrete random variables $X$, $Z$, with some nonzero mutual information $I[X,Z] > 0$. I would like to understand the minimum entropy of a variable $Y$ such that $X \rightarrow Y \rightarrow Z$ is a Markov chain. By the data processing inequality, $I[X,Z] \leq I[X,Y] \leq H[Y]$, so $H[Y] \geq I[X,Z]$. But can we also find some $Y$ for which this bound is close to tight, that is, with $H[Y]$ close to $I[X,Z]$?

More formally: Let $F$ be the set of variables $Y$ such that $X \rightarrow Y \rightarrow Z$ is a Markov chain. Can we upper-bound $\inf_{Y\in F} H[Y]$ in terms of $I[X,Z]$? Or are there conditions under which such an upper bound holds? I'd be interested in any kind of upper bound on $\inf_{Y\in F} H[Y]$, even if it is much larger than $I[X,Z]$, as long as it improves on the trivial bound $\inf_{Y\in F} H[Y] \leq \min\{H[X], H[Z]\}$ (obtained by taking $Y = X$ or $Y = Z$).
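To make the gap concrete, here is a toy numeric sketch (my own illustration, not part of the question): for a joint distribution over binary $X$, $Z$, it computes the DPI lower bound $I[X,Z]$ and the entropy of the trivial choice $Y = X$ (since $X \rightarrow X \rightarrow Z$ is always a Markov chain), showing how far apart the two can be.

```python
# Toy illustration: compare the DPI lower bound I[X,Z] with H[Y] for the
# trivial choice Y = X. The joint distribution below is an arbitrary example.
from math import log2

def H(ps):
    """Shannon entropy in bits of a probability vector (zeros skipped)."""
    return -sum(p * log2(p) for p in ps if p > 0)

# Joint distribution p(x, z); rows index X, columns index Z.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

p_x = [sum(row) for row in joint]            # marginal of X
p_z = [sum(col) for col in zip(*joint)]      # marginal of Z
H_xz = H([p for row in joint for p in row])  # joint entropy H[X,Z]
I_xz = H(p_x) + H(p_z) - H_xz                # mutual information I[X,Z]

H_y_trivial = H(p_x)  # Y = X gives a valid Markov chain with H[Y] = H[X]

print(f"I[X,Z]       = {I_xz:.4f} bits (lower bound on H[Y])")
print(f"H[Y] for Y=X = {H_y_trivial:.4f} bits (trivial upper bound)")
```

Here $I[X,Z] \approx 0.278$ bits while the trivial $Y = X$ has $H[Y] = 1$ bit, so the question is whether some cleverer $Y$ can get closer to the lower bound.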

I have looked around the web but didn't find anything related. Any pointers (or an explanation why this isn't possible) would be greatly appreciated. Thanks!