Minimum of two mutual informations


Given a discrete distribution $P(X_1,X_2,Y)$ is it possible to build $P(X,Y)$ such that $$ I(X;Y) = \min \{ I(X_1;Y), I(X_2;Y) \} $$ where $I$ is Shannon's mutual information?


Trivially, I would define $P^{X, Y}$ by $P^{X, Y} = P^{X_1, Y}$ if $I(X_1;Y) \leq I(X_2;Y)$ and $P^{X, Y} = P^{X_2, Y}$ otherwise. But maybe this is not what you actually want?

(Here $P^{X_1, Y}$ denotes the joint distribution of $(X_1, Y)$, obtained by marginalizing $P^{X_1, X_2, Y}$ over $X_2$. I hope the notation is clear.)
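The construction above can be sketched in code: compute the two marginals of $P^{X_1, X_2, Y}$, evaluate each mutual information, and return whichever marginal achieves the minimum. This is a minimal illustration with NumPy; the function names are my own, not from the question.

```python
import numpy as np

def mutual_information(pxy):
    """Shannon mutual information (in bits) of a 2-D joint distribution."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal of the first variable
    py = pxy.sum(axis=0, keepdims=True)   # marginal of the second variable
    mask = pxy > 0                        # 0 * log 0 = 0 by convention
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

def min_mi_marginal(p_x1x2y):
    """Given P(X1, X2, Y) as a 3-D array, return P(X, Y) as the marginal
    (over X1 or X2) with the smaller mutual information with Y."""
    p_x1y = p_x1x2y.sum(axis=1)           # marginalize out X2
    p_x2y = p_x1x2y.sum(axis=0)           # marginalize out X1
    i1 = mutual_information(p_x1y)
    i2 = mutual_information(p_x2y)
    return p_x1y if i1 <= i2 else p_x2y

# Example: Y = X1 (1 bit of mutual information), X2 uniform and
# independent of Y (0 bits), so the minimum is attained by (X2, Y).
p = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, x1] = 0.25
out = min_mi_marginal(p)   # equals the uniform joint P(X2, Y)
```

By construction $I(X;Y)$ of the returned distribution is exactly $\min\{I(X_1;Y), I(X_2;Y)\}$, since $X$ is literally one of $X_1, X_2$.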