Why is relative entropy negative in this computation?


I am running some tests using relative entropy on physical systems in equilibrium, and I am seeing some strange results. I wonder whether this is an issue in Mathematica itself, but here goes.

I have two systems, one defined by the potential $U_1 (x) = x^2 - 0.2x$ and another defined by the potential $U_2 (x) = 0.91x^2 -0.18x$, where $x\in (-\infty, \infty)$.

I am calculating the relative entropy $H$ between these two systems, where $$ H = \int_{-\infty}^{\infty} p_1(x) \log \frac{p_1 (x)}{p_2 (x)} dx $$

$$ p_i(x) = \frac{\exp (-\beta U_i (x))}{\int_{-\infty}^{\infty} \exp (-\beta U_i (x)) dx} $$ By definition (Gibbs' inequality), the relative entropy $H$ must be non-negative. But when I plug in those expressions for $U_i$, I get a negative value for $H$. Here is my Mathematica code:
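As a sanity check on my expectation: since both potentials are quadratic, $U_i(x) = a_i x^2 - b_i x$, each $p_i$ is a Gaussian with
$$ \mu_i = \frac{b_i}{2 a_i}, \qquad \sigma_i^2 = \frac{1}{2 \beta a_i}, $$
and the relative entropy between two Gaussians has the standard closed form
$$ H = \log \frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2 \sigma_2^2} - \frac{1}{2}, $$
which is non-negative for any choice of the parameters. So the analytic answer cannot be negative, and whatever I am seeing should be a numerical or coding issue.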

[screenshot of Mathematica code]
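Since the screenshot above is just an image, here is a minimal numerical sketch of the same calculation in Python (not my original Mathematica code; $\beta = 1$ and the grid limits are choices I made for the sketch):

```python
import numpy as np

beta = 1.0  # inverse temperature (assumed value for this sketch)

def U1(x):
    return x**2 - 0.2 * x

def U2(x):
    return 0.91 * x**2 - 0.18 * x

# Grid wide enough that both Boltzmann weights have decayed to ~0 at the edges
x = np.linspace(-20.0, 20.0, 200001)
dx = x[1] - x[0]

def boltzmann(U):
    """Normalized equilibrium density p(x) = exp(-beta U) / Z."""
    w = np.exp(-beta * U(x))
    return w / (w.sum() * dx)  # simple Riemann normalization

p1 = boltzmann(U1)
p2 = boltzmann(U2)

# Relative entropy H = integral of p1 * log(p1 / p2)
H = np.sum(p1 * np.log(p1 / p2)) * dx
print(H)  # small but non-negative
```

With both densities normalized on the same grid, this quadrature gives a small positive $H$, which is what I expected to see from Mathematica as well.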

How is this possible? Shouldn't relative entropy always be non-negative? Or is there a mistake in my calculation somewhere?

I would appreciate any advice you have for me!