Mutual information and predicting one variable from the other


Is there any theory that connects the mutual information of two random variables $X$ and $Y$ with the ability to approximate $Y$ by $f(X)$ for a best-fitting function $f$? Put differently, is there a result relating $I(X;Y)$ to a quantity of the form $\displaystyle\inf_{f \in \mathcal{C}}\, \mathbb{E}_{(X,Y)}[\|f(X)-Y\|^2]?$
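To make the question concrete: in the jointly Gaussian case the two quantities are related in closed form. For unit-variance Gaussians with correlation $\rho$, one has $I(X;Y) = -\tfrac{1}{2}\ln(1-\rho^2)$ and $\inf_f \mathbb{E}[(f(X)-Y)^2] = 1-\rho^2 = e^{-2 I(X;Y)}$. A minimal numerical sketch (the variable names and the choice $\rho = 0.8$ are mine, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8
n = 200_000

# Jointly Gaussian (X, Y) with unit variances and correlation rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Closed-form mutual information for a bivariate Gaussian (in nats).
mi = -0.5 * np.log(1 - rho**2)

# The MSE-optimal predictor E[Y | X] is linear here, f(x) = rho * x;
# estimate it empirically by least squares and measure the residual MSE.
slope = np.dot(x, y) / np.dot(x, x)
mse = np.mean((y - slope * x) ** 2)

print(mi, mse, np.exp(-2 * mi))  # mse should be close to exp(-2 * mi) = 0.36
```

In general (non-Gaussian) settings only inequalities of this flavor are available, which is presumably what the question is after.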