Independences implied by the Maximum Entropy distribution, subject to dependency constraints


Consider a discrete distribution $P(X,Y,Z)$. If we wish to find the maximum entropy distribution (MaxEntDist) of $P$ subject to preserving the marginals $P(X)$, $P(Y)$ and $P(Z)$, it is given by $P_{X:Y:Z} = P(X)P(Y)P(Z)$.
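A quick numerical check of this (a sketch using NumPy, with an arbitrary random joint distribution): the outer product of the marginals preserves each marginal and its entropy is at least that of the original joint.

```python
import numpy as np

# Arbitrary joint distribution P(X, Y, Z) with some dependence
rng = np.random.default_rng(0)
P = rng.random((2, 3, 2))
P /= P.sum()

# Marginals P(X), P(Y), P(Z)
Px = P.sum(axis=(1, 2))
Py = P.sum(axis=(0, 2))
Pz = P.sum(axis=(0, 1))

# Candidate MaxEntDist subject to these marginals: the outer product
P_ind = np.einsum('i,j,k->ijk', Px, Py, Pz)

def entropy(Q):
    Q = Q[Q > 0]
    return -np.sum(Q * np.log(Q))

# The product distribution matches the marginals and dominates in entropy
assert np.allclose(P_ind.sum(axis=(1, 2)), Px)
assert np.allclose(P_ind.sum(axis=(0, 2)), Py)
print(entropy(P_ind) >= entropy(P))  # True
```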

The $P_{X:Y:Z}$ notation indicates that $P_{X:Y:Z}$ is the MaxEntDist satisfying $X \perp Y$, $X \perp Z$ and $Y \perp Z$. I saw this notation in the article "Graphical Models in Reconstructability Analysis and Bayesian Networks" by Harris and Zwick.

Another example that is trivial to compute is the MaxEntDist that keeps the marginals $P(X,Y)$ and $P(Z)$. We represent it by $P_{XY:Z}$ and it is given by $P_{XY:Z} = P(X,Y)P(Z)$. If $P$ has any dependencies beyond those captured by these marginals, then $P$ has strictly lower entropy than $P_{XY:Z}$. Another trivial example is $P_{XZ:YZ} = P(Z)P(X|Z)P(Y|Z)$, which implies the conditional independence $X \perp Y \mid Z$.
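The $P_{XZ:YZ}$ construction can also be checked numerically (a sketch, again with NumPy and a random joint): built from the marginals $P(X,Z)$ and $P(Y,Z)$, it preserves both and, conditioned on each value of $Z$, factorizes as an outer product, i.e. $X \perp Y \mid Z$.

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.random((2, 3, 2))
P /= P.sum()

# Marginals P(X,Z) and P(Y,Z); P(Z) is shared between them
Pxz = P.sum(axis=1)   # shape (|X|, |Z|)
Pyz = P.sum(axis=0)   # shape (|Y|, |Z|)
Pz = Pxz.sum(axis=0)  # shape (|Z|,)

# P_{XZ:YZ}(x,y,z) = P(z) P(x|z) P(y|z) = P(x,z) P(y,z) / P(z)
Q = np.einsum('ik,jk->ijk', Pxz, Pyz) / Pz[None, None, :]

# Both pairwise marginals are preserved
assert np.allclose(Q.sum(axis=1), Pxz)
assert np.allclose(Q.sum(axis=0), Pyz)

# X ⊥ Y | Z: each conditional slice Q(x,y|z) is an outer product
for z in range(Q.shape[2]):
    Qz = Q[:, :, z] / Q[:, :, z].sum()
    assert np.allclose(Qz, np.outer(Qz.sum(axis=1), Qz.sum(axis=0)))
print("X ⊥ Y | Z holds")
```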

On the other hand, if the dependency structure is cyclic, such as $XY:XZ:YZ$, then there is no closed-form expression for $P_{XY:XZ:YZ}$; from what I've read, it has to be approximated by iterative methods.

My question is two-fold. First, what (conditional) independencies does $P_{XY:XZ:YZ}$ imply? Second, is there an algorithm that converts a dependency structure into the list of independencies implied by the resulting distribution?

Thank you.