I'm trying to solve the following problem:
Given a discrete random variable $A$ with alphabet $\chi \subset \{ 1, 2, \dots\}$, what is the relationship between $H(A)$, the entropy of $A$, and $H(B)$, the entropy of $B$, when $B$ is defined as follows:
(a) $B = \log_2(A).$
(b) $B = A^2.$
As I understand it, the problem is solved by looking at the ranges of the variables $A$ and $B$. Let $\chi$ be the range of $A$. In both cases $B$ is a one-to-one function of $A$: on the positive integers, both $\log_2$ and squaring are injective. So each value of $A$ maps to a distinct value of $B$ carrying the same probability mass, i.e. the pmf of $B$ is just a relabeling of the pmf of $A$, and for that reason $H(A)=H(B)$.
Is my reasoning correct? Any suggestion is welcome. Thanks in advance.
It's true that, for discrete variables, a one-to-one function does not change the entropy. This should be obvious conceptually (entropy as a measure of information). A small proof follows.
In general (from the chain rule) we have $H(A,B)= H(A)+ H(B|A) = H(B) + H(A|B)$.
If $B=f(A)$ ($B$ is a deterministic function of $A$), then $H(B|A)=0$. If, in addition, the function is one-to-one, so that $A=f^{-1}(B)$, then we also have $H(A|B)=0$.

Hence $H(A)=H(B)$.
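For a quick numerical sanity check, here is a small Python sketch (the distribution for $A$ is an arbitrary choice of mine): since $B=A^2$ is one-to-one on the positive integers, the pmf of $B$ is just a relabeling of the pmf of $A$, and the two entropies coincide.

```python
from math import log2

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {value: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# Arbitrary example distribution for A over the alphabet {1, 2, 3, 4}
pmf_A = {1: 0.5, 2: 0.25, 3: 0.125, 4: 0.125}

# B = A^2: one-to-one on positive integers, so probabilities are merely relabeled
pmf_B = {a**2: p for a, p in pmf_A.items()}

print(entropy(pmf_A))  # 1.75
print(entropy(pmf_B))  # 1.75
```

Note that if the map were not one-to-one (e.g. $B=A^2$ with an alphabet containing both $a$ and $-a$), probabilities would merge and the entropy could only decrease.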