If I have two random variables $Y$ and $U$ related by $Y=bU$, where $b>0$ is a constant, and knowing that
$\text{H}(x)$ denotes the Shannon (differential) entropy, defined as $$ \text{H}(x)=-\int \text{p}(x) \ \log_2(\text{p}(x)) \ dx, $$
then what is the entropy $\text{H}(Y)$ in terms of $U$? Can I somehow expand $\text{H}(Y)$ into a form like $\text{H}(U) - \log_2(b)$?
Note that the density $p_Y$ of $Y$ is related to that of $U$ by $p_Y(y) = p_U(y/b)/b$. Then, substituting $u = y/b$ (so $du = dy/b$) in the third step, $$\eqalign{H(Y) &= - \int_\mathbb R p_Y(y)\; \log_2(p_Y(y))\; dy \cr &= - \dfrac{1}{b} \int_\mathbb R p_U(y/b)\; (\log_2(p_U(y/b)) - \log_2(b))\; dy\cr &= - \int_\mathbb R p_U(u)\; (\log_2(p_U(u)) - \log_2(b))\; du\cr &= H(U) + \log_2(b).}$$ So the sign is positive, not negative: scaling by $b$ *adds* $\log_2(b)$ bits when $b>1$ (and subtracts them when $b<1$).
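A quick numerical sanity check of the identity $H(bU) = H(U) + \log_2(b)$, using a standard Gaussian for $U$ (my choice of example, not part of the question) and a plain Riemann sum for the entropy integral:

```python
import math

def gauss_pdf(x, sigma):
    # Density of N(0, sigma^2); Y = b*U with U ~ N(0,1) has sigma = b
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def entropy_bits(pdf, lo, hi, n=200_000):
    # Midpoint Riemann-sum approximation of -∫ p(x) log2(p(x)) dx
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        p = pdf(lo + (i + 0.5) * h)
        if p > 0:
            total -= p * math.log2(p) * h
    return total

b = 4.0
H_U = entropy_bits(lambda x: gauss_pdf(x, 1.0), -12.0, 12.0)   # H(U)
H_Y = entropy_bits(lambda x: gauss_pdf(x, b), -12.0 * b, 12.0 * b)  # H(bU)

print(H_U, H_Y, H_Y - H_U, math.log2(b))
# H_Y - H_U ≈ log2(b) = 2, matching H(U) + log2(b)
```

For a Gaussian this can also be checked in closed form, since $H(\mathcal N(0,\sigma^2)) = \tfrac12 \log_2(2\pi e \sigma^2)$, which differs from the $\sigma=1$ case by exactly $\log_2(\sigma)$.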