$H(Y|X)=0$ if $Y=f(X)$ where $f$ is deterministic and $X$ is a continuous r.v.


I'd like to show that the conditional entropy $H(Y|X)$ is zero when $Y$ is a deterministic function of $X$ (i.e., $Y=f(X)$) and $X$ is a continuous random variable.

This claim is easy to prove when $X$ is a discrete r.v., but I feel rather clueless for a continuous r.v. $X$.

Should I approach this with the Dirac delta function? I can write $$ H(Y|X) = -\int_x p(x) \int_y p(y|x) \log p(y|x)\,dy\,dx = -\int_x p(x) \int_y \delta_{f(x)}(y) \log \delta_{f(x)}(y)\,dy\,dx $$ but from here, I am stuck.

BEST ANSWER

There are a few things going on here.


First: the entropy definition for discrete r.v.'s does not apply to continuous r.v.'s. Instead, you can use the differential entropy $h(X)$, which is defined for continuous r.v.'s: $$h(X)=-\int_{\mathcal{X}}f_X(x)\log\big(f_X(x)\big)\,dx\,,$$ with $\mathcal{X}$ being the support of $X$ and $f_X(x)$ the PDF of $X$.
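As a concrete sanity check (my own example, not from the original answer): for $X\sim\mathrm{Uniform}(0,a)$, the PDF is $f_X(x)=1/a$ on $(0,a)$, and the definition above gives

```latex
h(X) = -\int_0^a \frac{1}{a}\log\frac{1}{a}\,dx = \log a \,,
```

which is negative whenever $a<1$, foreshadowing the remarks below.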

In the same way, the conditional differential entropy $h(Y|X)$ is given by $$h(Y|X)=-\int_{\mathcal{X},\mathcal{Y}}f_{XY}(x,y)\log\big(f_{Y|X}(y|x)\big)\,dx\,dy=h(X,Y)-h(X)\,,$$ with the corresponding definitions of support, joint PDF, and conditional PDF. These quantities are also used to define the mutual information of continuous r.v.'s.

Also, note that differential entropy can be negative (because PDFs can exceed 1), and that, unlike discrete entropy, it changes when the variable is rescaled.
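The scale dependence can be made precise with a standard change-of-variables computation (a sketch I'm adding, assuming $c\neq 0$): for $Y=cX$ the PDF is $f_Y(y)=f_X(y/c)/|c|$, so substituting $x=y/c$,

```latex
h(cX) = -\int f_Y(y)\log f_Y(y)\,dy
      = -\int f_X(x)\big(\log f_X(x) - \log|c|\big)\,dx
      = h(X) + \log|c| \,,
```

i.e., rescaling shifts differential entropy by $\log|c|$, whereas in the discrete case $H(cX)=H(X)$.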


Second, the conditional differential entropy $h(g(X)|X)$ is not zero. When $Y=f(X)$, the conditional distribution of $Y$ given $X=x$ is a point mass at $f(x)$, which has no density, so $h(Y|X)$ is $-\infty$ (or undefined) rather than $0$. Thus, the statement you are trying to prove is not true in the continuous case. See this MSE question for a better exposition and discussion. Hope this helps!
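One way to see the divergence (a sketch I'm adding, via an $\varepsilon$-smoothing of the point mass): perturb $Y=f(X)$ by independent noise $\varepsilon U$ with $U\sim\mathrm{Uniform}(0,1)$. Given $X=x$, the perturbed variable is uniform on an interval of length $\varepsilon$, so

```latex
h\big(f(X)+\varepsilon U \,\big|\, X\big)
  = h(\varepsilon U)
  = \log \varepsilon
  \;\longrightarrow\; -\infty
  \quad\text{as } \varepsilon \to 0 \,,
```

so as the smoothing is removed, the conditional differential entropy diverges to $-\infty$ instead of vanishing.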