The kernel density estimator (when constructed using a Gaussian kernel) is
$\hat{f}(x)=\frac{1}{n} \sum_{i=1}^{n} \frac{1}{\sqrt{2 \pi \sigma^{2}}}\exp\left(-\frac{(x-x_{i})^{2}}{2 \sigma^{2}}\right)$
for a data set of size $n$, where $\sigma$ is the bandwidth.
I'm trying to find its differential entropy, as measured in nats (so that the base of the logarithm is e):
$- \int_{-\infty}^{\infty} \hat{f}(x) \ln(\hat{f}(x))dx$
I have no idea how to go about computing this. Can anyone tell me a closed form for the differential entropy of this kernel density estimator, for an arbitrary set of observations $\{x_{1}, x_{2}, \ldots, x_{n}\}$?
(I'm being very liberal with the phrase "closed form," I just need the integral to vanish.)
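To be concrete about the quantity I'm after, here is a numerical sketch of it in Python using `scipy.integrate.quad` (the truncation of the integration range to $\pm 10\sigma$ beyond the data is my own assumption, as is the function name):

```python
import numpy as np
from scipy import integrate

def kde_entropy(xs, sigma):
    """Differential entropy (in nats) of a Gaussian-kernel KDE,
    evaluated numerically by quadrature."""
    xs = np.asarray(xs, dtype=float)

    def f_hat(x):
        # Gaussian-kernel KDE density at a point x
        return np.mean(np.exp(-(x - xs) ** 2 / (2 * sigma ** 2))) \
            / np.sqrt(2 * np.pi * sigma ** 2)

    # -∫ f ln f dx, truncated well past the data (assumed cutoff: 10σ)
    lo, hi = xs.min() - 10 * sigma, xs.max() + 10 * sigma
    integrand = lambda x: -f_hat(x) * np.log(f_hat(x))
    h, _ = integrate.quad(integrand, lo, hi, limit=200)
    return h

# Sanity check: with a single observation the KDE is a pure Gaussian,
# whose differential entropy is (1/2) ln(2πeσ²).
print(kde_entropy([0.0], 1.0))  # ≈ 1.4189
```

So I can already get the number this way; what I'm hoping for is an expression where the integral vanishes.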