Let $f\in L^1(\mathbb{R}^d)$ be the density of a probability measure on $\mathbb{R}^d$, and define its differential entropy as the functional
$$ \mathcal{H}(f)=-\int_{\mathbb{R}^d}f\log(f)\,dx.$$
Suppose I mollify $f$ with a standard mollifier $\rho_\delta$. Is there a way to relate the differential entropy of $f_\delta=\rho_\delta*f$ to that of $f$?
What I would like to conclude is that either
- $\mathcal{H}(f_\delta)\rightarrow \mathcal{H}(f)$ as $\delta\rightarrow 0$,
- or at least that if $\mathcal{H}(f)<\infty$, then $\mathcal{H}(f_\delta)$ is bounded uniformly in $\delta$ for all $\delta<\delta_0$, for some choice of $\delta_0>0$.
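As a sanity check on the first hope: if both $f$ and the smoothing kernel are Gaussian, the convolution is explicit (variances add), so one can see $\mathcal{H}(f_\delta)\to\mathcal{H}(f)$ numerically. Note this is only a heuristic sketch: a Gaussian kernel stands in for the compactly supported mollifier, and `gaussian_entropy` is just a helper using the closed form $\frac12\log(2\pi e\sigma^2)$.

```python
import math

def gaussian_entropy(sigma2):
    # Differential entropy of N(0, sigma2) in nats: (1/2) log(2*pi*e*sigma2)
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

# f = N(0, 1); smoothing with a Gaussian kernel of variance delta^2
# gives f_delta = N(0, 1 + delta^2), since convolving Gaussians adds variances.
H_f = gaussian_entropy(1.0)
for delta in [1.0, 0.1, 0.01]:
    H_fd = gaussian_entropy(1.0 + delta**2)
    print(f"delta={delta}: H(f_delta) - H(f) = {H_fd - H_f:.6f}")
```

In this toy case the gap $\mathcal{H}(f_\delta)-\mathcal{H}(f)=\tfrac12\log(1+\delta^2)\downarrow 0$, and in particular $\mathcal{H}(f_\delta)\ge\mathcal{H}(f)$, consistent with the fact that adding independent noise never decreases entropy.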
Searching for answers to this question I found the article http://repositorio.uchile.cl/bitstream/handle/2250/125309/Piera_convergence.pdf?sequence=1, which provides a sufficient condition for a sequence of densities $f_n$ to have converging differential entropies ($\mathcal{H}(f_n)\rightarrow \mathcal{H}(f)$). However, it requires the condition
$$ \sup_{n>0}\left\|\frac{f_n}{f}\right\|_{L^{\infty}(d\mu)}<\infty,$$ where $d\mu=f(x)\,dx$. This condition seems to fail under mollification: if the support of $f$ is not all of $\mathbb{R}^d$, then near the boundary of the support $f$ can vanish while $f_\delta$ stays positive, so the ratio $f_\delta/f$ blows up. So this result doesn't seem to do it.
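To illustrate why the ratio condition fails, here is a small numerical check (my own toy example, not from the article): take $f(x)=2x$ on $[0,1]$ and mollify with a uniform kernel on $[-\delta,\delta]$ standing in for $\rho_\delta$. Near $x=0$, $f(x)\to 0$ while $f_\delta(x)$ picks up mass from nearby, so $f_\delta/f$ is unbounded on the support of $f$.

```python
import numpy as np

def f(x):
    # Density f(x) = 2x on [0, 1], zero elsewhere
    return np.where((x >= 0) & (x <= 1), 2 * x, 0.0)

def f_delta(x, delta, n=200001):
    # Mollification by the uniform kernel on [-delta, delta]:
    # f_delta(x) = (1/(2*delta)) * integral of f over [x-delta, x+delta],
    # i.e. the average of f over that window.
    y = np.linspace(x - delta, x + delta, n)
    return f(y).mean()

delta = 0.1
for x in [0.1, 0.01, 0.001]:
    ratio = f_delta(x, delta) / f(x)
    print(f"x={x}: f_delta(x)/f(x) = {ratio:.2f}")
```

For fixed $\delta$ the ratio grows without bound as $x\downarrow 0$ (here $f_\delta(x)\approx (x+\delta)^2/(2\delta)$ for $0<x<\delta$, so the ratio is roughly $(x+\delta)^2/(4\delta x)$), confirming that $\|f_\delta/f\|_{L^\infty(d\mu)}=\infty$.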