The relation between the entropy of random variables $X$ and $Y=g(X)$

A previous post has shown that for random variables $X$ and $Y=bX$, where $b > 0$, the entropies of $X$ and $Y$ are not equal (Entropy of $Y=bX$). However, wouldn't any bijection $g$ applied to a random variable $X$ yield $H(X)=H(g(X))$? It seems intuitive that entropy should be unchanged in this case, since we are essentially relabelling the outcomes of $X$.
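For a quick numerical sanity check, here is a minimal Python sketch; the pmf values and the particular bijection $g(x)=5x+7$ are hypothetical choices, picked only for illustration:

```python
import numpy as np

def entropy(pmf):
    """Shannon entropy H = -sum p log2(p), with 0*log(0) taken as 0."""
    p = np.array([v for v in pmf.values() if v > 0])
    return -np.sum(p * np.log2(p))

# A hypothetical pmf for X over the outcomes {0, 1, 2, 3}.
px = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

# A bijection g only relabels the outcomes; the probability masses move
# to new labels but their values are untouched.
g = lambda x: 5 * x + 7                # injective on the support of X
py = {g(x): p for x, p in px.items()}  # pmf of Y = g(X)

print(entropy(px))   # H(X)    = 1.8464...
print(entropy(py))   # H(g(X)) = 1.8464..., same multiset of masses
```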

1 Answer

The post you are referring to is about differential entropy $h(X)$, not the discrete Shannon entropy $H(X)$.

It's true that simply relabelling the outcomes of a discrete random variable $X$ cannot change its entropy $H(X) = -\sum_x p(x)\log p(x)$. Differential entropy is an analogous quantity for continuous random variables, but it lacks some of the nice properties of $H(X)$: for example, it can be negative, and it changes under scaling.
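To make both points precise: if $g$ is a bijection, then $p_Y(g(x)) = p_X(x)$, so the sum defining the entropy of $Y = g(X)$ is just a reindexing of the sum for $X$:

$$H(Y) = -\sum_y p_Y(y)\log p_Y(y) = -\sum_x p_X(x)\log p_X(x) = H(X).$$

In the continuous case, however, $Y = bX$ with $b > 0$ has density $f_Y(y) = f_X(y/b)/b$, and substituting $x = y/b$ gives

$$h(Y) = -\int f_Y(y)\log f_Y(y)\,dy = -\int f_X(x)\log\frac{f_X(x)}{b}\,dx = h(X) + \log b,$$

so the scaling factor $b$ appears as an additive $\log b$ term instead of cancelling out.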

The relation between these two concepts of entropy is discussed at http://en.wikipedia.org/wiki/Differential_entropy