Mutual information between 2 sequences of random variables?


How would I go about expanding $I(X_1,...,X_n;Y_1,...,Y_n)$?

There is a chain rule for the case of a single variable $Y$, namely $I(X_1,...,X_n;Y)=\sum^n_{i=1} I(X_i;Y|X_{i-1},...,X_1)$, but I'm not sure whether it can also be applied when both arguments are sequences of random variables.
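One way I imagine the chain rule might still apply (though I'm not certain this is the intended expansion) is to treat $(Y_1,...,Y_n)$ as a single random variable and expand only over the $X_i$:

$$I(X_1,...,X_n;Y_1,...,Y_n)=\sum^n_{i=1} I(X_i;Y_1,...,Y_n\mid X_{i-1},...,X_1).$$

Presumably each term on the right could then be expanded again over the $Y_j$ by a second application of the chain rule.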

EDIT: the assumption is that $X_1,...,X_n$ are independent, and I want to show that

$I(X_1,...,X_n;Y_1,...,Y_n)\geq \sum^n_{i=1}I(X_i;Y_i)$
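In case it helps, here is a sketch of the direction I was considering, assuming the independence of $X_1,...,X_n$ and using only the chain rule for entropy and the fact that conditioning cannot increase entropy (I have not verified every step):

$$
\begin{aligned}
I(X_1,...,X_n;Y_1,...,Y_n) &= H(X_1,...,X_n)-H(X_1,...,X_n\mid Y_1,...,Y_n)\\
&= \sum^n_{i=1} H(X_i)-\sum^n_{i=1} H(X_i\mid Y_1,...,Y_n,X_{i-1},...,X_1)\\
&\geq \sum^n_{i=1} H(X_i)-\sum^n_{i=1} H(X_i\mid Y_i)\\
&= \sum^n_{i=1} I(X_i;Y_i),
\end{aligned}
$$

where the second line uses the independence of the $X_i$ together with the entropy chain rule, and the inequality uses the fact that conditioning reduces entropy. Is this the right approach, or is there a cleaner way via a two-sided chain rule?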