Define the entropy of a distribution $p$ as $\mathbb{H}(p) = -\int p \log p$. How does the entropy change for distributions that are proportional to powers of $p$?
For example, define $\tilde{p} = \frac{\sqrt{p}}{\int \sqrt{p}}$. Can we say anything about the relation between $\mathbb{H}(p)$ and $\mathbb{H}(\tilde{p})$? If no general rule is possible, what if we restrict $p$ to certain families of distributions (e.g., the exponential family)?
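As a concrete special case (a hand computation, just to illustrate the question): if $p = \mathcal{N}(\mu, \sigma^2)$, then normalizing $p^\alpha$ gives $\mathcal{N}(\mu, \sigma^2/\alpha)$, so with $\alpha = 1/2$ we get $\tilde{p} = \mathcal{N}(\mu, 2\sigma^2)$ and
$$\mathbb{H}(\tilde{p}) = \tfrac{1}{2}\log\!\left(2\pi e \cdot 2\sigma^2\right) = \mathbb{H}(p) + \tfrac{1}{2}\log 2.$$
I am asking whether anything like this holds beyond the Gaussian case.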
The paper "Capacity bounds via duality with applications to multiple-antenna systems on flat-fading channels" (Lapidoth and Moser, 2003) has some nice results along these lines. For example:
$$h(\log X) = h(X) - \mathbb{E}[\log X],$$ $$h(X^2) = h(X) + \mathbb{E}[\log X] + \log 2,$$
both of which follow from a change of random variable, keeping track of the Jacobian of the transformation. For other powers of $X$, the same change-of-variables argument with the corresponding Jacobian yields analogous identities.
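As a quick numerical sanity check of the two identities (my own sketch, assuming `numpy` and `scipy` are available), one can take $X \sim \mathrm{Exp}(1)$, for which $h(X) = 1$ and $\mathbb{E}[\log X] = -\gamma$ (the Euler–Mascheroni constant), derive the densities of $X^2$ and $\log X$ by change of variables, and integrate the entropies directly:

```python
import numpy as np
from scipy.integrate import quad

# X ~ Exp(1): f(x) = e^{-x}, so h(X) = -∫ f log f = ∫ x e^{-x} dx = 1.
h_X, _ = quad(lambda x: x * np.exp(-x), 0, np.inf)

# E[log X] = ∫ e^{-x} log x dx = -γ ≈ -0.5772 (split at 1: log singularity at 0).
e = lambda x: np.exp(-x) * np.log(x)
E_logX = quad(e, 0, 1)[0] + quad(e, 1, np.inf)[0]

# Y = X^2: change of variables gives f_Y(y) = e^{-√y} / (2√y),
# hence log f_Y(y) = -√y - log 2 - (1/2) log y (used analytically to avoid log(0)).
f_Y = lambda y: np.exp(-np.sqrt(y)) / (2.0 * np.sqrt(y))
log_f_Y = lambda y: -np.sqrt(y) - np.log(2.0) - 0.5 * np.log(y)
g = lambda y: -f_Y(y) * log_f_Y(y)
h_Y = quad(g, 0, 1)[0] + quad(g, 1, np.inf)[0]  # integrable singularity at 0

# L = log X: f_L(y) = f(e^y) e^y = exp(y - e^y), so log f_L(y) = y - e^y.
# Tails beyond [-40, 6] are negligible (density ~ e^{-40} and e^{-e^6}).
h_L, _ = quad(lambda y: -np.exp(y - np.exp(y)) * (y - np.exp(y)), -40, 6)

print(h_Y, h_X + E_logX + np.log(2))  # both ≈ 1.1159
print(h_L, h_X - E_logX)              # both ≈ 1.5772
```

The same template (write down $f_Y$ via the Jacobian, integrate $-f_Y \log f_Y$) works for any power $Y = X^\alpha$ of a positive random variable.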