Mutual information between independent measurement of two dependent random variables


Suppose I have two dependent random variables, $X_1$ and $X_2$. I perform an independent measurement on each of these two variables, and denote the output random variables as $Y_1$ and $Y_2$, respectively. By independent measurement, I mean that $Y_1$ and $Y_2$ are conditionally independent given $X_1$ (or $X_2$).

Is it possible to express $I(Y_1;Y_2)$ in terms of $I(X_1;X_2)$ in a simple form? It feels like this problem must have been worked on before, but I couldn't find any references to it. I'd appreciate any hint as to how to approach this.

EDIT: As requested, by conditional independence I mean $P(Y_1,Y_2|X_1) = P(Y_1|X_1)P(Y_2|X_1)$ and $P(Y_1,Y_2|X_2) = P(Y_1|X_2)P(Y_2|X_2)$. By measurement, I just mean that $Y_2$ depends on $X_2$ directly, so think of $X_2$ as the input to a noisy channel and $Y_2$ as its output, and likewise for $X_1$ and $Y_1$. In summary: two correlated inputs, each sent through a separate independent channel, and I'm interested in an expression for the mutual information (MI) of the outputs in terms of the MI of the inputs.
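To experiment with this setup numerically, here is a minimal sketch. The specific choices (binary inputs with $P(X_1 = X_2) = 0.9$, and two binary symmetric channels with flip probability $0.1$) are my own illustrative assumptions, not from the question. Since $Y_1 - X_1 - X_2 - Y_2$ forms a Markov chain under the stated conditional independence, the data-processing inequality guarantees $I(Y_1;Y_2) \le I(X_1;X_2)$, which the computation below exhibits:

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information (in bits) of a 2-D joint distribution p_xy[x, y]."""
    px = p_xy.sum(axis=1, keepdims=True)  # marginal of the first variable
    py = p_xy.sum(axis=0, keepdims=True)  # marginal of the second variable
    mask = p_xy > 0                       # avoid log(0) terms
    return float((p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])).sum())

# Correlated binary inputs: P(X1 = X2) = 0.9, uniform marginals (an assumed example)
p_x = np.array([[0.45, 0.05],
                [0.05, 0.45]])

# Each input goes through its own binary symmetric channel with flip prob 0.1
eps = 0.1
bsc = np.array([[1 - eps, eps],
                [eps, 1 - eps]])  # bsc[x, y] = P(Y = y | X = x)

# P(Y1, Y2) = sum over x1, x2 of P(x1, x2) * P(y1 | x1) * P(y2 | x2)
p_y = bsc.T @ p_x @ bsc

i_x = mutual_information(p_x)  # I(X1; X2), about 0.531 bits here
i_y = mutual_information(p_y)  # I(Y1; Y2), strictly smaller by data processing
print(f"I(X1;X2) = {i_x:.4f} bits, I(Y1;Y2) = {i_y:.4f} bits")
```

This doesn't give a closed-form expression, but it makes it easy to check candidate formulas against concrete joint distributions and channel matrices.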