Can we do the u-substitution with $\frac{1}{a}$ instead of $\frac{1}{\lvert a \rvert}$ while proving the scaling property of differential entropy?


Here $h$ is the differential entropy, defined by $h(X) = - \int f(x) \log(f(x))\, dx$, where $f$ is the pdf of the random variable $X$.

There is a "scaling" property of $h$ which states that $h(Y) = h(X) + \log\lvert a \rvert$, where $Y = aX$, $a \ne 0$ is a constant, and $X$ is a random variable.
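For concreteness, here is a quick numerical sanity check of the property (a sketch only, not part of the proof; it uses the Gaussian, whose differential entropy $\frac{1}{2}\log(2\pi e \sigma^2)$ is known in closed form, including a negative scale factor):

```python
import math

def gaussian_entropy(sigma):
    """Differential entropy (in nats) of N(0, sigma^2): 0.5*log(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# If X ~ N(0, sigma^2), then Y = aX ~ N(0, (a*sigma)^2), so h(Y) is known in
# closed form and can be compared against h(X) + log|a|.
sigma = 1.5
for a in (2.0, -3.0, 0.5):          # note: includes a negative a
    h_x = gaussian_entropy(sigma)
    h_y = gaussian_entropy(abs(a) * sigma)
    assert math.isclose(h_y, h_x + math.log(abs(a)))
```

The check passes for both signs of $a$, which is what the proof below is trying to establish in general.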

I was pondering how to prove it for both cases, $a > 0$ and $a < 0$. I came up with the following proof, but I am not sure whether the substitution in the integral is legitimate. Could you please help me with this integral?

When $X$ has the pdf $f(x)$, $Y$ has the pdf $\frac{1}{|a|}f(\frac{y}{a})$. This can be seen by differentiating the CDF of $Y$: for $a > 0$, $F_Y(y) = P(aX \le y) = F_X(\frac{y}{a})$, while for $a < 0$ the inequality flips and $F_Y(y) = 1 - F_X(\frac{y}{a})$, which is where the absolute value comes from.
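As a sketch (using a standard normal for $X$, which is my own choice and not part of the question), one can check numerically that $\frac{1}{|a|}f(\frac{y}{a})$ integrates to $1$ even when $a < 0$, i.e. that it really is a density:

```python
import math

def f(x):
    """Standard normal pdf for X."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def f_y(y, a):
    """pdf of Y = aX obtained by differentiating the CDF: (1/|a|) f(y/a)."""
    return f(y / a) / abs(a)

# Riemann sum over [-10, 10] (Y = -2X has sd 2, so the tails are negligible);
# the total should be ~1 even for a negative scale factor.
a = -2.0
dy = 0.001
total = sum(f_y(-10 + i * dy, a) for i in range(int(20 / dy))) * dy
assert abs(total - 1.0) < 1e-4
```

Without the absolute value, the "density" for $a < 0$ would integrate to $-1$, which already hints at why $\frac{1}{|a|}$ rather than $\frac{1}{a}$ is needed.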

After applying the product rule for logarithms to $\log\big(\frac{1}{|a|}f(\frac{y}{a})\big)$, the expression becomes:

$$h(Y) = -\int \frac{1}{|a|}f(\frac{y}{a}) \log\big(f(\frac{y}{a})\big)dy + \log(\lvert a \rvert) \int \frac{1}{\lvert a \rvert}f(\frac{y}{a}) dy $$

We do the substitution $x = \frac{y}{a}$, so $dx = \frac{1}{a}dy$. Can we apply this substitution here and write:

$$h(Y)= - \int f(x) \log (f(x))\,dx + \log(\lvert a \rvert) \int f(x)\,dx $$ $$ h(Y) = h(X) + \log (\lvert a \rvert) $$
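The end result can at least be verified numerically for a negative $a$ (again a sketch with a standard normal for $X$, computing both entropies directly from their densities by midpoint-rule integration):

```python
import math

def f(x):
    """Standard normal pdf for X."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def entropy_from_pdf(pdf, lo, hi, n=200000):
    """Midpoint-rule approximation of -integral of p*log(p) over [lo, hi]."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        p = pdf(lo + (i + 0.5) * dx)
        if p > 0:
            total -= p * math.log(p) * dx
    return total

a = -3.0
f_y = lambda y: f(y / a) / abs(a)      # density of Y = aX

h_x = entropy_from_pdf(f, -10, 10)     # X has sd 1
h_y = entropy_from_pdf(f_y, -30, 30)   # Y has sd 3

assert abs(h_y - (h_x + math.log(abs(a)))) < 1e-4
```

This only confirms the identity for one distribution; the question about the legitimacy of the change of variables for $a < 0$ still stands.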

Is this a legitimate change of variables in both cases, $a > 0$ and $a < 0$? How should we treat the absolute value here?