Push Forwards and Change of Variable Confusion


Taking a look at the pushforward measure and the change of variable formula https://en.wikipedia.org/wiki/Pushforward_measure I thought that for a measure $\gamma$ on $\mathbb{R}^d$ (with density also denoted $\gamma$), a map $f:\mathbb{R}^d\to \mathbb{R}^d$, and $g:\mathbb{R}^d \to \mathbb{R}$,

$$ \int_{\mathbb{R}^d} g(f_{\#}\gamma(x)) df_{\#}\gamma = \int_{\mathbb{R}^d} g(\mu(x)) d\gamma. $$

In particular, denoting the Boltzmann entropy by $H(\gamma):=\int \log\gamma(x)\, d\gamma$, this would imply

$$H(f_{\#}\gamma)-H(\gamma)=0~?$$

On BEST ANSWER

You really need to sort out your notation and typos.

You write:

$$ \int_{\mathbb{R}^d} g(f_{\#}\gamma(x)) df_{\#}\gamma = \int_{\mathbb{R}^d} g(\mu(x)) d\gamma. $$

I'm assuming that $\mu$ is $\gamma$ and that's just a typo.

With that reading, the right hand side $\int_{\mathbb{R}^d} g(\gamma(x)) d\gamma $ could make sense if $\gamma$ denotes both the density and the measure (bad notation). Then $\gamma(x)$ is the value of the density function, so $\gamma(x)\in \mathbb{R}$. But what is $g(\gamma(x))$? Your $g$ is of the wrong type for this to make sense. Do you want $g$ to be a map $\mathbb{R}\to \mathbb{R}$? Or do you just want to write $g(x)$? Presumably the latter.

The left hand side $ \int_{\mathbb{R}^d} g(f_{\#}\gamma(x)) df_{\#}\gamma$ has even more problems. Here $f_{\#}$ appears twice: once as a pushforward of the measure $\gamma$, and once, presumably, as a pushforward of the function $\gamma$ (?). The problem is that functions don't push forward, they pull back: for a function $h(y)$ we have $f^*(h)(x)=h(f(x))$. The whole expression is ill-formed.

Compare with what is written in Wiki:

$$\int_{X_2} h \, d(f_* \mu) = \int_{X_1} h \circ f \, d\mu$$

Here $h:\mathbb{R}^d \to \mathbb{R}$. Let $g(x)=h(f(x))$ for some $h(y)$. Then we get:

$$\int_y h(y) d(f_* \mu)=\int_x g(x) d \mu$$
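As a quick numerical sanity check of this identity (my own example, not part of the original answer): take $\mu = N(0,1)$, $f(x)=x^2$, so that $f_* \mu$ is the chi-squared distribution with one degree of freedom, and the test function $h(y)=e^{-y}$. Integrating $h$ against the pushforward density should match the Monte Carlo average of $h\circ f$ under $\mu$:

```python
import numpy as np
from scipy.stats import chi2
from scipy.integrate import quad

# Illustration of  ∫ h d(f_* mu) = ∫ h∘f dmu  for mu = N(0,1), f(x) = x^2,
# so that f_* mu is chi-squared with 1 degree of freedom, and h(y) = exp(-y).
h = lambda y: np.exp(-y)

# Right hand side: push samples of mu through f and average h
rng = np.random.default_rng(0)
x = rng.normal(size=2_000_000)
rhs = h(x**2).mean()

# Left hand side: integrate h against the pushforward density directly
lhs, _ = quad(lambda y: h(y) * chi2.pdf(y, df=1), 0, np.inf)

print(lhs, rhs)  # both ≈ 1/sqrt(3) ≈ 0.5774
```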

Now if $g(x)=\log p(x)$ where $p(x)$ is the density of $\mu$, then the right hand side is the entropy $H(\mu)$. The left hand side is

$$\int_y h(y)\, d(f_* \mu).$$

Assuming that $f$ is invertible, $h(y)=g(f^{-1}(y))$. Then the left hand side is

$$\int_y h(y) d(f_* \mu)=\int_y g(f^{-1}(y)) d(f_* \mu)= $$

$$\int_y \log p(f^{-1}(y)) d(f_* \mu)$$

Now, IF the density of $f_* \mu$ were $p(f^{-1}(y))$, this would be the entropy of $f_* \mu$, BUT IT IS NOT.

In fact the density of $f_* \mu$ is $$q(y)=p(f^{-1}(y))|Df^{-1}(y)|$$
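For a concrete check of this density formula (again my own example): take $\mu=N(0,1)$ and $f(x)=e^x$, so $f^{-1}(y)=\log y$ and $|Df^{-1}(y)|=1/y$. The pushforward should then be the standard log-normal distribution:

```python
import numpy as np
from scipy.stats import norm, lognorm

# Check q(y) = p(f^{-1}(y)) |Df^{-1}(y)| for mu = N(0,1) and f(x) = exp(x):
# here f^{-1}(y) = log y and |Df^{-1}(y)| = 1/y, so f_* mu should be the
# standard log-normal distribution.
y = np.linspace(0.5, 5.0, 50)
q_formula = norm.pdf(np.log(y)) / y      # p(f^{-1}(y)) * |Df^{-1}(y)|
q_lognormal = lognorm(s=1.0).pdf(y)      # scipy's standard log-normal density
print(np.allclose(q_formula, q_lognormal))  # True
```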

Then $$H(f_* \mu)=\int_y \log [p(f^{-1}(y))|Df^{-1}(y)|] d(f_* \mu)=$$

$$\int_y \left[\log p(f^{-1}(y))+ \log|Df^{-1}(y)|\right] d(f_* \mu)=$$

$$\int_y \log [p(f^{-1}(y))] d(f_* \mu)+ \int_y \log|Df^{-1}(y)| d(f_* \mu)=$$

$$H(\mu)+E_{f_* \mu}[ \log|Df^{-1}(y)|]=H(\mu)+E_{\mu}[ \log (1/|Df(x)|)]= H(\mu)-E_{\mu}[ \log |Df(x)|]$$

in agreement with https://en.wikipedia.org/wiki/Differential_entropy#Properties_of_differential_entropy
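This relation is easy to verify numerically. Here is a sketch (my own, using the sign convention $H(p)=\int \log p \, dp$ from above) for a Gaussian pushed through the linear map $f(x)=ax$, where $\log|Df(x)|=\log|a|$:

```python
import numpy as np

# Numerical check of  H(f_# mu) = H(mu) - E_mu[log |Df(x)|]
# with the convention H(p) = ∫ log p dp, for mu = N(0, sigma^2), f(x) = a*x.
sigma, a = 1.5, 3.0

def log_normal_density(x, s):
    # log of the N(0, s^2) density
    return -0.5 * np.log(2 * np.pi * s**2) - x**2 / (2 * s**2)

rng = np.random.default_rng(0)
x = rng.normal(0.0, sigma, size=1_000_000)

H_mu = log_normal_density(x, sigma).mean()            # H(mu) = E_mu[log p(X)]
H_push = log_normal_density(a * x, a * sigma).mean()  # f_# mu = N(0, (a*sigma)^2)

# Predicted shift: -E_mu[log |Df|] = -log|a|
print(H_push - H_mu, -np.log(abs(a)))  # the two numbers agree
```

The agreement is exact here because for a linear map the log-density shifts pointwise by $-\log|a|$; for nonlinear $f$ the shift $-E_\mu[\log|Df|]$ is a genuine expectation.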

Some remarks:

  1. Denoting the measure and its density by the same letter is a bad idea in this setting, because they don't transform the same way. If your measure $\gamma$ has density $p(x)$, aka $\gamma=p(x)dx$, then the pushforward $f_{\#}\gamma$ has density $p(f^{-1}(y))|Df^{-1}(y)|$, aka $f_{\#}\gamma=p(f^{-1}(y))|Df^{-1}(y)|dy$ -- mnemonically $p(x)dx=p(x(y)) |dx/dy| dy$. Thus when computing the entropy you will not be integrating the transformed function $\log p(x(y))$ but rather the transformed log-density, i.e. $\log (p(f^{-1}(y))|Df^{-1}(y)|)$, so the integrals will not be equal.

  2. The underlying reason for this is that the entropy of a measure is computed relative to the "background" Lebesgue measure (all the density functions are taken with respect to the Lebesgue measure -- as Radon-Nikodym derivatives, if that means anything to you). The Lebesgue measure on the target is not the pushforward of the Lebesgue measure on the domain, and this is what makes the entropy not invariant under coordinate changes.