Is it possible to encode the mutual information $I(X;Y)$ between two random variables $X$ and $Y$ into a third random variable $Z$, such that $Z$ "contains" exactly the information that $X$ and $Y$ give about each other, i.e., $H(Z) = I(X;Y)$, $I(X;Z) = I(X;Y)$, and $I(Y;Z) = I(X;Y)$? I suppose one can view this as finding a $Z$ with $H(Z) = I(X;Y)$ such that $X \rightarrow Z \rightarrow Y$ is a Markov chain: the data-processing inequality then gives $I(X;Y) \le I(X;Z) \le H(Z) = I(X;Y)$ (and likewise for $I(Y;Z)$), forcing all three equalities.
Put another way, I want a setting in which the Venn diagram imagination is actually precisely correct, i.e. a setting where I can actually point to a random variable $Z$ and say that this $Z$ is exactly the intersection in the Venn diagram.
Here is a simple setting where I can find such a $Z$. Let $Z, a, b$ be mutually independent indicator random variables, and let $X = (Z, a)$ and $Y = (Z, b)$. Then it's intuitively clear that $I(X;Y) = I(X;Z) = I(Y;Z) = H(Z)$. Formally:
$$I(X;Z) = H(Z) - H(Z|X) = H(Z)$$ $$I(Y;Z) = H(Z) - H(Z|Y) = H(Z)$$ $$I(X;Y) = H(X) - H(X|Y) = H(Z,a) - H(a \mid Z,b) = H(Z) + H(a) - H(a) = H(Z)$$
The first two hold because $Z$ is a function of both $X$ and $Y$, so $H(Z|X) = H(Z|Y) = 0$; the last uses $H(X|Y) = H(Z,a \mid Z,b) = H(a \mid Z,b) = H(a)$ by mutual independence.
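As a sanity check on this derivation, here is a small numerical verification of the example, assuming $Z, a, b$ are three independent fair bits (any mutually independent choice would do):

```python
from itertools import product
from math import log2

# Joint distribution of (z, a, b): three independent fair bits,
# so each of the 8 outcomes has probability 1/8.
p = {zab: 1 / 8 for zab in product((0, 1), repeat=3)}

def H(f):
    """Entropy (in bits) of the variable f(z, a, b) under p."""
    marg = {}
    for (z, a, b), pr in p.items():
        v = f(z, a, b)
        marg[v] = marg.get(v, 0.0) + pr
    return -sum(pr * log2(pr) for pr in marg.values() if pr > 0)

def I(f, g):
    """Mutual information I(f; g) = H(f) + H(g) - H(f, g)."""
    return H(f) + H(g) - H(lambda z, a, b: (f(z, a, b), g(z, a, b)))

X = lambda z, a, b: (z, a)
Y = lambda z, a, b: (z, b)
Z = lambda z, a, b: z

print(H(Z), I(X, Y), I(X, Z), I(Y, Z))  # all four equal 1 bit
```

Since $Z$ is a single fair bit here, all four quantities come out to exactly $1$ bit, matching the chain of equalities above.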
Can we find such a $Z$ in a more general setting?