Justification of $h(aX) = h(X) + \log|a|$ (Differential entropy under scaling)


Let $h$ denote the differential entropy, defined by $h(X) = - \int f(x) \log(f(x))\, dx$, where $f$ is the probability density function (PDF) of the random variable $X$.

I know there is a property of $h$ stating that $h(Y) = h(X) + \log|a|$ when $Y = aX$, where $a \ne 0$ is a deterministic constant and $X$ is a random variable.

I was wondering what the justification of this is for the case

$Y = aX$, with the PDF of $Y$ given by $f_Y(y) = \frac{1}{|a|} f_X(y/a)$.


As you mention, if $X$ has the pdf $f(x)$ and $a > 0$, then $Y$ has the pdf $a^{-1} f(a^{-1} y)$. This can be seen by differentiating the cdf $F_Y(y) = P(Y \le y) = P(X \le a^{-1} y) = F_X(a^{-1} y)$: by the chain rule, $\frac{d}{dy} F_X(a^{-1} y) = a^{-1} F'_X(a^{-1} y) = a^{-1} f(a^{-1} y)$.

Therefore for $a > 0$, \begin{align} h(Y) & = -\int a^{-1} f(a^{-1}x) \log\big(a^{-1} f(a^{-1}x)\big) dx \\ & = -\int a^{-1} f(a^{-1}x) \log\big(f(a^{-1}x)\big) dx - \int a^{-1} f(a^{-1}x) \log(a^{-1}) dx \tag{1}\\ & = -\int f(x) \log(f(x)) dx + \log(a) \cdot \int a^{-1} f(a^{-1}x) dx \tag{2} \\ & = -\int f(x) \log(f(x)) dx + \log(a) = h(X) + \log|a| \tag{3} \end{align} holds, where the final equality uses $\log(a) = \log|a|$ for $a > 0$.
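For completeness, the case $a < 0$ follows the same pattern; a sketch:

\begin{align} F_Y(y) = P(aX \le y) = P(X \ge a^{-1} y) = 1 - F_X(a^{-1} y), \end{align}

so $f_Y(y) = -a^{-1} f(a^{-1} y) = |a|^{-1} f(a^{-1} y)$, and repeating steps $(1)$–$(3)$ with $|a|$ in place of $a$ again gives $h(Y) = h(X) + \log|a|$.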

We use the following three facts:

  1. $\log(a b) = \log(a) + \log(b)$ and $\log(a^{-1}) = - \log(a)$ for $a,b > 0$.
  2. $\int g(a^{-1} x) dx = a \int g(x) dx$ for $a > 0$ (substitute $u = a^{-1} x$).
  3. $\int f(x) dx = 1$, when $f$ is a pdf.
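The identity can also be sanity-checked numerically. Here is a minimal sketch in Python (standard library only), using a standard Gaussian for $X$ — a choice made purely for illustration, not part of the original question — and a midpoint Riemann sum for the entropy integral:

```python
import math

def numerical_entropy(pdf, lo, hi, n=200_000):
    """Midpoint-rule approximation of h = -integral pdf(x) log pdf(x) dx over [lo, hi]."""
    dx = (hi - lo) / n
    h = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = pdf(x)
        if p > 0.0:
            h -= p * math.log(p) * dx
    return h

a = -3.0  # deterministic scale, a != 0 (negative on purpose, to exercise |a|)
pdf_X = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)  # X ~ N(0, 1)
pdf_Y = lambda y: pdf_X(y / a) / abs(a)                          # pdf of Y = aX

h_X = numerical_entropy(pdf_X, -10, 10)   # closed form: 0.5*log(2*pi*e) ~ 1.4189
h_Y = numerical_entropy(pdf_Y, -30, 30)

print(h_Y - h_X)          # should be close to log|a| = log 3 ~ 1.0986
print(math.log(abs(a)))
```

The tails beyond $10$ standard deviations contribute negligibly, so the truncated integration ranges do not affect the comparison at this precision.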