How to minimize mutual information between a variable and a correlated sequence of variables?


I'm working on a problem of minimizing the mutual information between a variable and a correlated sequence of variables. Formally, I have a random variable $Y$ and a sequence of random variables $X_{1}, X_{2}, \dots, X_{n}$, and I want to minimize $I(Y; X_{i})$ for every $i \in \{1,2,\dots,n\}$.

What I know is that the $X_{i}$ are highly correlated with one another, which suggests that $I(X_{i};X_{j})$ may be quite large, but there is no mathematical constraint that guarantees it.

I know that the most straightforward approach would be to minimize $I(X_{i}; Y)$ one at a time, but I want to find out whether there is a simpler way to do this.
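For concreteness, here is a minimal sketch of what the "one by one" approach looks like for discrete samples, using a plug-in (empirical) estimator of mutual information. The data below (`Y`, `X1`, `X2`) are hypothetical stand-ins for samples of $Y$ and the $X_i$:

```python
from collections import Counter
import math

def mutual_information(xs, ys):
    """Plug-in estimate of I(X; Y) in bits from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # empirical joint counts
    px = Counter(xs)             # empirical marginal counts of X
    py = Counter(ys)             # empirical marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts rescaled by n
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

# Hypothetical samples: X1 and X2 are correlated, noisy copies of each other
Y  = [0, 0, 1, 1, 0, 1, 0, 1]
X1 = [0, 0, 1, 1, 0, 1, 1, 1]
X2 = [0, 1, 1, 1, 0, 1, 1, 1]

for i, X in enumerate([X1, X2], start=1):
    print(f"I(Y; X{i}) = {mutual_information(X, Y):.3f} bits")
```

Each $I(Y; X_i)$ is then evaluated (and minimized) separately, which is exactly the cost the question hopes to avoid when $n$ is large.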

I tried to use the concavity of entropy to bound this quantity, and I have: \begin{align} I\left(\frac{\sum_{i}{X_{i}}}{n};Y\right) &= H\left(\frac{\sum_{i}{X_{i}}}{n}\right) - H\left(\frac{\sum_{i}{X_{i}}}{n}\,\middle|\,Y\right) \\ &\geq \frac{1}{n}\sum_{i}H(X_{i}) - H\left(\frac{\sum_{i}{X_{i}}}{n}\,\middle|\,Y\right). \end{align}

But since the term $H(\frac{\sum_{i}{X_{i}}}{n}\mid Y)$ enters with a negative sign, I cannot push this any further.

Does anyone have an idea about this problem?

Thank you for your help!