When is the entropy H(X, Y) = H(X) = H(Y)?


I read in my textbook the following corollary, which follows from the chain rule for entropy: $H(X, Y) \ge H(X)$ and $H(X, Y) \ge H(Y)$.

I was wondering what the condition for equality is, where $X$ and $Y$ are arbitrary random variables.
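
For discrete variables the corollary itself is easy to check numerically. Below is a minimal sketch using numpy; the joint pmf `p_xy` is an arbitrary example of my own, not taken from the textbook:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array; 0 log 0 is taken as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Arbitrary example joint pmf p(x, y); rows index x, columns index y.
p_xy = np.array([[0.25, 0.10],
                 [0.05, 0.60]])

H_xy = entropy(p_xy)               # joint entropy H(X, Y)
H_x = entropy(p_xy.sum(axis=1))    # marginal entropy H(X)
H_y = entropy(p_xy.sum(axis=0))    # marginal entropy H(Y)

# The corollary: the joint entropy dominates each marginal entropy.
assert H_xy >= H_x and H_xy >= H_y
print(H_xy, H_x, H_y)
```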

BEST ANSWER

The immediate answer is

$\lim\limits_{Y \to f(X)} H(X, Y) = H(X),$

where $Y \to f(X)$ denotes that the random variable $Y$ approaches a deterministic function of the random variable $X$.
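
For discrete $X$ and $Y$ the condition can be stated exactly; this is the standard consequence of the chain rule mentioned in the question, added here for completeness:

$$H(X, Y) = H(X) + H(Y \mid X) \ge H(X),$$

with equality if and only if $H(Y \mid X) = 0$, i.e. if and only if $Y$ is (almost surely) a deterministic function of $X$. Symmetrically, $H(X, Y) = H(Y)$ holds exactly when $X$ is a deterministic function of $Y$, so $H(X, Y) = H(X) = H(Y)$ requires each variable to determine the other.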

The mathematical subtlety (which is why I write it as a limit) is whether one admits the form $p(X, f(X))$ as an acceptable probability density or probability mass function. For example, if $Y = f(X) = X$ (that is, $Y$ is just $X$ repeated), many (most?) measure theorists would declare that $p(X, X)$, and thus $H(X, X)$, does not exist, because the joint distribution is not supported on the full 2D plane but only along the line $Y = X$, making it singular with respect to Lebesgue measure.
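
In the discrete case, by contrast, the pair $(X, f(X))$ has a perfectly well-defined joint pmf (all of its mass lies on the pairs $(x, f(x))$), and $H(X, Y) = H(X)$ holds exactly, with no limit required. A minimal numeric sketch; the pmf `p_x` and the map `f` below are arbitrary choices for illustration:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array; 0 log 0 is taken as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = np.array([0.5, 0.3, 0.2])   # arbitrary pmf for X on {0, 1, 2}
f = lambda x: x % 2               # arbitrary deterministic map, so Y = f(X)

# Joint pmf of (X, Y): all probability mass sits on the pairs (x, f(x)).
p_xy = np.zeros((3, 2))
for x, px in enumerate(p_x):
    p_xy[x, f(x)] = px

# The nonzero joint probabilities are exactly the values of p_x,
# so H(X, Y) equals H(X) with no limiting argument needed.
print(entropy(p_xy), entropy(p_x))   # both print the same value
```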