Let $X\in\{0,1\}^d$ be a Boolean vector and let $Y, Z\in\{0,1\}$ be Boolean variables. Suppose we are given a joint distribution $\mathcal{D}$ over $(Y, Z)$, and we'd like to find a joint distribution $\mathcal{D}'$ over $(X, Y, Z)$ such that:
1. The marginal of $\mathcal{D}'$ on $(Y, Z)$ equals $\mathcal{D}$.
2. $X$ is independent of $Z$ under $\mathcal{D}'$, i.e., $I(X;Z) = 0$.
3. $I(X; Y)$ is maximized,
where $I(\cdot;\cdot)$ denotes the mutual information. For now, I don't even know a nontrivial upper bound on $I(X;Y)$ given that $I(X;Z) = 0$. Furthermore, is it possible to characterize the optimal distribution $\mathcal{D}'$ that achieves the upper bound?
My conjecture is that the upper bound on $I(X;Y)$ should involve the correlation (coupling?) between $Y$ and $Z$, so ideally the bound should contain a term quantifying that dependence.
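One way to probe this numerically (everything here, the toy $\mathcal{D}$, the restriction to $d=1$, and the grid resolution, is my own choice, not part of the question): parametrize $\mathcal{D}'$ by $q(y,z) = P(X=1\mid Y=y, Z=z)$, which automatically preserves the $(Y,Z)$-marginal, brute-force over a grid keeping only points with $I(X;Z)\approx 0$, and compare the best $I(X;Y)$ found against $1 - I(Y;Z)$:

```python
import itertools
import math

def H(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mi(pair):
    """I(A;B) for a pairwise joint given as {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), p in pair.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return H(pa.values()) + H(pb.values()) - H(pair.values())

# Toy D over (Y, Z): a pair of correlated bits (my own choice).
pYZ = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
iYZ = mi(pYZ)

# Parametrize D' by q[(y, z)] = P(X=1 | Y=y, Z=z); any such q automatically
# matches the (Y, Z)-marginal, so only the I(X;Z)=0 constraint must be checked.
grid = [i / 10 for i in range(11)]
best = 0.0
for qs in itertools.product(grid, repeat=4):
    q = dict(zip(pYZ.keys(), qs))
    pXZ, pXY = {}, {}
    for (y, z), p in pYZ.items():
        for x in (0, 1):
            px = q[(y, z)] if x == 1 else 1 - q[(y, z)]
            pXZ[(x, z)] = pXZ.get((x, z), 0.0) + p * px
            pXY[(x, y)] = pXY.get((x, y), 0.0) + p * px
    if mi(pXZ) < 1e-9:              # keep only feasible grid points
        best = max(best, mi(pXY))

print(f"I(Y;Z) = {iYZ:.4f}, bound 1 - I(Y;Z) = {1 - iYZ:.4f}, best I(X;Y) = {best:.4f}")
assert best <= 1 - iYZ + 1e-9
```

On this coarse grid the search only gives a lower estimate of the true optimum, but it already suggests how much of the $1 - I(Y;Z)$ budget is attainable.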
We have the following chain of inequalities: \begin{align} I(X;Y) & \le I(X;Y,Z) \\ & = I(X;Z) + I(X;Y|Z) \\ & = I(X;Y|Z) \\ & = I(X,Z;Y) - I(Y;Z) \\ & \le H(Y) - I(Y;Z) \\ & \le 1 \text{ bit} - I(Y;Z) , \end{align} where the second and fourth lines use the chain rule for mutual information, the third line uses $I(X;Z)=0$, and the last line uses $H(Y)\le 1$ bit since $Y$ is binary.
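As a sanity check, a short script can verify each step of the chain numerically. Everything here is my own construction: I restrict to $d=1$ and sample $p(x,y,z) = p(x)\,p(z)\,p(y\mid x,z)$, a factorization that forces $I(X;Z)=0$ while leaving $I(X;Y)$ and $I(Y;Z)$ generic.

```python
import itertools
import math
import random

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def Hm(joint, idx):
    """Entropy of the marginal of {(x, y, z): p} on the index tuple idx."""
    m = {}
    for k, p in joint.items():
        key = tuple(k[i] for i in idx)
        m[key] = m.get(key, 0.0) + p
    return H(m.values())

# Sample a random p(x, y, z) = p(x) p(z) p(y | x, z); this factorization
# gives X independent of Z by construction, i.e. I(X;Z) = 0.
random.seed(1)
px1, pz1 = random.random(), random.random()
py1 = {(x, z): random.random() for x in (0, 1) for z in (0, 1)}
joint = {}
for x, y, z in itertools.product((0, 1), repeat=3):
    base = (px1 if x else 1 - px1) * (pz1 if z else 1 - pz1)
    joint[(x, y, z)] = base * (py1[(x, z)] if y else 1 - py1[(x, z)])

X, Y, Z = 0, 1, 2
def I(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B) for index tuples a, b."""
    return Hm(joint, a) + Hm(joint, b) - Hm(joint, a + b)

i_xy         = I((X,), (Y,))
i_x_yz       = I((X,), (Y, Z))
i_xz         = I((X,), (Z,))
i_xy_given_z = Hm(joint, (X, Z)) + Hm(joint, (Y, Z)) - Hm(joint, (Z,)) - Hm(joint, (X, Y, Z))
i_xzy        = I((X, Z), (Y,))
i_yz         = I((Y,), (Z,))
h_y          = Hm(joint, (Y,))

eps = 1e-9
assert i_xy <= i_x_yz + eps                        # I(X;Y) <= I(X;Y,Z)
assert abs(i_x_yz - (i_xz + i_xy_given_z)) < eps   # chain rule
assert abs(i_xz) < eps                             # I(X;Z) = 0 by construction
assert abs(i_xy_given_z - (i_xzy - i_yz)) < eps    # chain rule again
assert i_xzy <= h_y + eps                          # I(X,Z;Y) <= H(Y)
assert h_y <= 1.0 + eps                            # Y is one bit
assert i_xy <= 1.0 - i_yz + eps                    # the final bound
```

The two chain-rule identities hold with equality up to floating-point error, and the end-to-end bound $I(X;Y) \le 1 - I(Y;Z)$ follows as in the derivation.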