Bound on mutual information after nonlinearity


I can calculate the mutual information $I[X,Y]$ between two (possibly multivariate) Gaussian random variables. Is there anything I can say about $I[X, f(Y)]$ (perhaps establish a bound), where $f$ is some nonlinearity? I understand that the data processing inequality establishes $I[X,Y]$ as an upper bound, but is there a stronger statement I can make, given that $X$ and $Y$ are Gaussian, if the nonlinearity is sufficiently "simple" (for example, a squared nonlinearity)?
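For intuition, here is a quick numerical sketch (my own illustration, not part of the question: it uses a crude histogram plug-in estimator) comparing the closed-form Gaussian value $I[X,Y] = -\tfrac12\log(1-\rho^2)$ against estimates of $I[X,Y]$ and $I[X,Y^2]$ for a correlated pair:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8
n = 200_000

# Jointly Gaussian pair (X, Y) with correlation rho.
y = rng.standard_normal(n)
x = rho * y + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Closed form for a bivariate Gaussian: I(X;Y) = -1/2 * log(1 - rho^2), in nats.
mi_exact = -0.5 * np.log(1 - rho**2)

def mi_hist(a, b, bins=30):
    """Crude plug-in (histogram) estimate of mutual information in nats."""
    pab, _, _ = np.histogram2d(a, b, bins=bins)
    pab /= pab.sum()                         # joint probabilities
    pa = pab.sum(axis=1, keepdims=True)      # marginal of a
    pb = pab.sum(axis=0, keepdims=True)      # marginal of b
    m = pab > 0
    return float((pab[m] * np.log(pab[m] / (pa * pb)[m])).sum())

print(mi_exact)            # ~0.511 nats for rho = 0.8
print(mi_hist(x, y))       # close to the closed-form value
print(mi_hist(x, y**2))    # noticeably smaller: squaring discards sign(Y)
```

The last line illustrates the question: $Y^2$ throws away the sign of $Y$, so the estimate of $I[X, Y^2]$ falls well below $I[X,Y]$, consistent with (and strictly below) the data-processing bound.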



If $f$ is a one-to-one (invertible) function, then $f(Y)$ gives you the same info as $Y$, and so $I(X;Y)=I(X;f(Y))$. Formally, since you can invert $f$, two applications of the data processing inequality give $$ I(X;Y)\geq I(X;f(Y)) \geq I(X;f^{-1}(f(Y))) = I(X;Y).$$ Note that the squaring example from the question is not one-to-one on all of $\mathbb{R}$: it discards the sign of $Y$, so for a zero-mean Gaussian $Y$ correlated with $X$ you should expect a strict inequality $I(X;Y^2) < I(X;Y)$. Squaring preserves the mutual information only when it is invertible on the support of $Y$, e.g. when $Y$ is almost surely nonnegative.
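A quick numerical check of the invariance claim (a sketch; the equal-mass binning is my own choice here, picked because quantile bins depend only on the ranks of the data and are therefore unchanged by any strictly increasing $f$):

```python
import numpy as np

rng = np.random.default_rng(1)
rho = 0.8
n = 200_000

# Jointly Gaussian pair with correlation rho.
y = rng.standard_normal(n)
x = rho * y + np.sqrt(1 - rho**2) * rng.standard_normal(n)

def mi_quantile(a, b, bins=20):
    """Plug-in MI estimate (nats) using equal-mass (quantile) bins.

    Quantile bin edges depend only on the ranks of the samples, so the
    estimate is invariant under strictly increasing transforms of either
    argument -- mirroring the exact invariance of MI under 1-1 maps.
    """
    ia = np.digitize(a, np.quantile(a, np.linspace(0, 1, bins + 1))[1:-1])
    ib = np.digitize(b, np.quantile(b, np.linspace(0, 1, bins + 1))[1:-1])
    pab = np.zeros((bins, bins))
    np.add.at(pab, (ia, ib), 1.0)            # 2-D contingency counts
    pab /= pab.sum()
    pa = pab.sum(axis=1, keepdims=True)
    pb = pab.sum(axis=0, keepdims=True)
    m = pab > 0
    return float((pab[m] * np.log(pab[m] / (pa * pb)[m])).sum())

# y -> y**3 is strictly increasing, hence one-to-one: the two estimates agree.
print(mi_quantile(x, y))
print(mi_quantile(x, y**3))
```

Because $y \mapsto y^3$ preserves the ordering of the samples, both calls produce identical bin assignments and hence identical estimates, in line with $I(X;Y)=I(X;f(Y))$ for invertible $f$.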