Assume X and Y are conditionally independent given Z: $P(X,Y|Z)=P(X|Z)P(Y|Z)$.
We are interested in the mutual information between $(X,Y)$ and $Z$: $I(X,Y;Z)$. Can we decompose this mutual information into two components reflecting the contributions of $X$ and $Y$ separately?
Concretely, do there exist a functional $f$ and a binary function $g$ such that $I(X,Y;Z)=g\bigl(f(P(X,Z)),f(P(Y,Z))\bigr)$?
What I have tried:
$I(X,Y;Z)=H(X,Y)-H(X,Y|Z)=H(X,Y)-H(X|Z)-H(Y|Z)$
The second equality follows from the additivity of entropy for independent variables, applied for each value of $z$: conditional independence gives $H(X,Y|Z)=H(X|Z)+H(Y|Z)$. I am left with $H(X,Y)$, which does not decompose easily, since $X$ and $Y$ are not necessarily independent when not conditioned on $Z$.
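As a sanity check on the derivation above, here is a small numerical sketch in Python (the alphabet sizes, Dirichlet sampling, and random seed are all my own assumptions, not part of the question) that builds a conditionally independent joint $P(x,y,z)=P(z)P(x|z)P(y|z)$ and verifies $I(X,Y;Z)=H(X,Y)-H(X|Z)-H(Y|Z)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small alphabets for X, Y, Z.
nx, ny, nz = 3, 4, 2

# Build a joint satisfying conditional independence:
# P(x, y, z) = P(z) * P(x|z) * P(y|z).
pz = rng.dirichlet(np.ones(nz))                   # P(z)
px_given_z = rng.dirichlet(np.ones(nx), size=nz)  # row z is P(x|z)
py_given_z = rng.dirichlet(np.ones(ny), size=nz)  # row z is P(y|z)

p = np.einsum('z,zx,zy->xyz', pz, px_given_z, py_given_z)

def H(dist):
    """Shannon entropy in bits of a (flattened) distribution."""
    d = dist.ravel()
    d = d[d > 0]
    return -np.sum(d * np.log2(d))

# Marginals needed for the identity.
pxy = p.sum(axis=2)  # P(x, y)
pxz = p.sum(axis=1)  # P(x, z)
pyz = p.sum(axis=0)  # P(y, z)

# Direct definition: I(X,Y;Z) = H(X,Y) + H(Z) - H(X,Y,Z).
I_direct = H(pxy) + H(pz) - H(p)

# H(X|Z) = H(X,Z) - H(Z), and likewise for Y.
H_x_given_z = H(pxz) - H(pz)
H_y_given_z = H(pyz) - H(pz)

# The decomposition from the question: I(X,Y;Z) = H(X,Y) - H(X|Z) - H(Y|Z).
I_decomposed = H(pxy) - H_x_given_z - H_y_given_z

print(abs(I_direct - I_decomposed) < 1e-10)
```

The check prints `True` exactly because of the conditional independence built into the construction; for a joint where $X$ and $Y$ are dependent given $Z$, the two quantities would differ.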
Note: this question is superficially very similar to this one, but it is not equivalent to it.