$$ \int_x \delta(x)\,\ln(\delta(x))\,dx = 0\ ? $$
Here $\delta(x)$ denotes the Dirac delta function, $\ln(\cdot)$ is the natural logarithm, and $dx$ is simply the differential of $x$ for the integral.
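For reference, the standard sifting property that gives the Dirac delta its meaning under an integral (assuming $f$ is continuous at $0$) is

$$ \int_{-\infty}^{\infty} \delta(x)\,f(x)\,dx = f(0), $$

and in particular $\int_{-\infty}^{\infty} \delta(x)\,dx = 1$.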
I'm working on an applied probability problem whose behavior would be neatly explained if I could prove the above equality. Intuitively, the integral appears to equal zero, but I can't really be sure.
Informally it makes sense to me: $\delta$ is zero everywhere except at $x=0$, so the first $\delta$ in the integral should contribute a factor of $1$, and the other should lead to a logarithm of $1$, which is zero (i.e. $\int \ln(\delta) = \ln(1) = 0$); in the end this should give $1\cdot 0 = 0$.
However, I completely fail to come up with a formal justification. I'm probably missing some important step or property when working it out on paper. Or, worst case, the integral doesn't evaluate to $0$ at all. How should I tackle this problem?
P.S.: Feel free to change the tags of my question; I might have gotten them wrong.
Use integration by parts (per partes), the definition of the Dirac delta function, and the fundamental theorem of calculus as follows:
$$
\begin{aligned}
u &= \ln\delta(x) &\quad\Rightarrow\quad du &= \frac{d}{dx}\ln\delta(x)\,dx \\
dv &= \delta(x)\,dx &\quad\Rightarrow\quad v &= \int\delta(x)\,dx = 1
\end{aligned}
$$
$$
\int\delta(x)\ln\delta(x)\,dx = uv - \int v\,du = \ln\delta(x) - \int\frac{d}{dx}\ln\delta(x)\,dx = \ln\delta(x) - \ln\delta(x) = 0
$$