Conditional Differential Entropy and functions


I was reading about differential entropy and tried to find a result that would tell me, for two continuous random variables $X, Y$, when it holds that $$h(X\mid Y)=h(X\mid f(Y)).$$ I could not find such a statement in textbooks or in online resources. Intuitively, if $f$ is invertible then the equality should hold, since $f(Y)$ carries exactly the same information as $Y$ (being a 1-1 map), but is there something more general? Is there a known result that characterizes this equality fully? In case someone is not familiar with the concept: $h(X\mid Y) = -\mathbb{E}[\log f_{X\mid Y}(X\mid Y)]$, where $f_{X\mid Y}$ is the conditional pdf of $X$ given $Y$ (not to be confused with the function $f$ above).
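As a sanity check (my own sketch, not from any reference), here is a minimal numerical experiment using the identity $h(X\mid Y)=h(X)-I(X;Y)$, which turns the question into comparing $I(X;Y)$ with $I(X;f(Y))$. It assumes a jointly Gaussian pair and scikit-learn's k-NN mutual-information estimator `mutual_info_regression`; the choices $f(y)=y^3$ (invertible) and $g(y)=y^2$ (not injective on the support of $Y$) are purely illustrative.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n = 20_000

# Jointly Gaussian pair: Y ~ N(0, 1), X = Y + independent noise.
y = rng.normal(size=n)
x = y + rng.normal(scale=0.5, size=n)

def mi(feature, target):
    """k-NN (Kraskov-style) estimate of I(feature; target) in nats."""
    return mutual_info_regression(feature.reshape(-1, 1), target,
                                  random_state=0)[0]

# Since h(X|Y) = h(X) - I(X;Y), the equality h(X|Y) = h(X|f(Y))
# is equivalent to I(X;Y) = I(X; f(Y)).
print(f"I(X; Y)   ~ {mi(y, x):.3f}")
print(f"I(X; Y^3) ~ {mi(y**3, x):.3f}  # f invertible: should match I(X;Y)")
print(f"I(X; Y^2) ~ {mi(y**2, x):.3f}  # f not injective: loses the sign of Y")
```

In theory the invertible case gives exact equality because $f(Y)$ generates the same sigma-algebra as $Y$; the estimates above will only agree up to sampling and estimator error.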