The tight bound for conditional mutual information: how much could conditional mutual information be greater than mutual information?


Given random variables $X$,$Y$ and another random variable $Z$, it is known that there are cases when the conditional mutual information $I(X;Y|Z)$ is greater than mutual information $I(X;Y)$.

For example, let $Y = X + Z$ with $X \perp Z$. Since $Y$ is a deterministic function of $(X,Z)$, we have $H(Y|X,Z) = 0$, and therefore \begin{align} I(X;Y|Z) = H(Y|Z) = H(X+Z|Z) = H(X) \geq I(X;Y). \end{align} However, what is the upper bound on $I(X;Y|Z)$ for given $X$ and $Y$? We want to solve the following maximization problem: $$ \max_Z I(X;Y|Z), $$ where $Z$ ranges over all random variables.
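As a sanity check on this example, here is a small numerical sketch (assuming, as one concrete instance, that $X$ and $Z$ are fair independent Bernoulli variables and $Y = X + Z$ with ordinary integer addition, so $Y \in \{0,1,2\}$). It computes both mutual informations from the joint pmf via the standard entropy identities $I(X;Y) = H(X) + H(Y) - H(X,Y)$ and $I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)$:

```python
from math import log2
from collections import defaultdict

# Joint pmf of (X, Y, Z): X, Z fair independent Bernoullis, Y = X + Z.
p = defaultdict(float)
for x in (0, 1):
    for z in (0, 1):
        p[(x, x + z, z)] += 0.25

def H(pmf):
    """Entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * log2(q) for q in pmf.values() if q > 0)

def marginal(idx):
    """Marginal pmf over the coordinates listed in idx."""
    m = defaultdict(float)
    for k, q in p.items():
        m[tuple(k[i] for i in idx)] += q
    return m

# I(X;Y) = H(X) + H(Y) - H(X,Y)
I_xy = H(marginal((0,))) + H(marginal((1,))) - H(marginal((0, 1)))
# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
I_xy_given_z = (H(marginal((0, 2))) + H(marginal((1, 2)))
                - H(p) - H(marginal((2,))))

print(I_xy)          # 0.5 bits in this instance
print(I_xy_given_z)  # 1.0 bit = H(X), strictly larger
```

Here conditioning on $Z$ raises the mutual information from $0.5$ bits to the full $H(X) = 1$ bit, as the derivation above predicts.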

An upper bound (not necessarily tight) on $I(X;Y|Z)$ is readily obtained: \begin{align} I(X;Y|Z) &= I(X;Y,Z) - I(X;Z) \\ &= H(X) - H(X|Y,Z) - I(X;Z) \\ &\leq H(X), \end{align} since both $H(X|Y,Z)$ and $I(X;Z)$ are nonnegative. By symmetry, we also have $I(X;Y|Z) \leq H(Y)$. To summarize, $$ \max_Z I(X;Y|Z) \leq \min\{ H(X), H(Y)\}. $$ But can this inequality be an equality? More generally, for given random variables $X$, $Y$ (with a given joint distribution), what is the exact value of $\max_Z I(X;Y|Z)$?


1 Answer


Interesting question. I don't know the answer. A comment, though: for the bound to be attained, i.e., for the equality

$$ I(X;Y|Z) = H(X) - H(X|Y,Z) - I(X;Z) = H(X)$$

... to hold, one would need to construct a variable $Z$ such that $X = g(Y,Z)$ ($X$ is a deterministic function of $Y$ and $Z$) while $X$ and $Z$ are independent.

For joint Bernoulli variables we have the standard example: let $X,Y$ be fair independent Bernoulli variables and $Z = X + Y \pmod 2$. Then $X = Y + Z \pmod 2$ and $X \perp Z$, so $I(X;Y|Z) = H(X) = 1$ bit $= \min\{H(X),H(Y)\}$, even though $I(X;Y) = 0$. But I suspect that this is quite exceptional.
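The XOR example above can be verified numerically in the same way as before, by enumerating the joint pmf of $(X,Y,Z)$ and applying the entropy identities for mutual information (a minimal sketch; the fair-coin distribution is the one from the example):

```python
from math import log2
from collections import defaultdict

# X, Y fair independent Bernoullis; Z = X XOR Y.
p = defaultdict(float)
for x in (0, 1):
    for y in (0, 1):
        p[(x, y, (x + y) % 2)] += 0.25

def H(pmf):
    """Entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * log2(q) for q in pmf.values() if q > 0)

def marginal(idx):
    """Marginal pmf over the coordinates listed in idx."""
    m = defaultdict(float)
    for k, q in p.items():
        m[tuple(k[i] for i in idx)] += q
    return m

I_xy = H(marginal((0,))) + H(marginal((1,))) - H(marginal((0, 1)))
I_xy_given_z = (H(marginal((0, 2))) + H(marginal((1, 2)))
                - H(p) - H(marginal((2,))))

print(I_xy)          # 0.0: X and Y are independent
print(I_xy_given_z)  # 1.0 bit = H(X) = min{H(X), H(Y)}
```

So for this particular $(X,Y)$, the bound $\min\{H(X),H(Y)\}$ is attained exactly, with $I(X;Y) = 0$.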