KL divergence between two normal distributions can be negative?

There are many derivations of the KL divergence between two normal distributions. The closed-form expression is:

$$D_{KL}(p\,\|\,q) = \frac{1}{2}\biggl[\log\frac{|\Sigma_{q}|}{|\Sigma_{p}|} - k + (\mu_{p} - \mu_{q})^{T}\Sigma_{q}^{-1}(\mu_{p} - \mu_{q}) + \operatorname{tr}\bigl(\Sigma_{q}^{-1}\Sigma_{p}\bigr)\biggr]$$ $$k \text{ is the dimension of the normal distributions}$$ $$\mu_p, \Sigma_p \text{ are the mean and covariance of the normal distribution } p$$ $$\mu_q, \Sigma_q \text{ are the mean and covariance of the normal distribution } q$$

As far as I know, KL divergence is always greater than or equal to zero. Can the expression above be negative? (I implemented it, and I frequently get negative results.)
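For reference, here is a minimal NumPy sketch of the closed-form expression above (the function name `kl_mvn` and the test setup are my own, not from the question). With symmetric positive-definite covariance matrices it should never return a negative value; in practice, negative outputs usually come from covariance estimates that are not positive definite, or from a sign or argument mix-up in one of the terms.

```python
import numpy as np

def kl_mvn(mu_p, cov_p, mu_q, cov_q):
    """KL(p || q) for multivariate normals via the closed-form expression.

    Assumes cov_p and cov_q are symmetric positive definite.
    """
    k = mu_p.shape[0]
    cov_q_inv = np.linalg.inv(cov_q)
    diff = mu_p - mu_q
    # log |Sigma_q| / |Sigma_p|, computed via slogdet for numerical stability
    log_det_term = np.linalg.slogdet(cov_q)[1] - np.linalg.slogdet(cov_p)[1]
    quad_term = diff @ cov_q_inv @ diff          # (mu_p - mu_q)^T Sigma_q^-1 (mu_p - mu_q)
    trace_term = np.trace(cov_q_inv @ cov_p)     # tr(Sigma_q^-1 Sigma_p)
    return 0.5 * (log_det_term - k + quad_term + trace_term)

# Quick check with random SPD covariances: the result should be >= 0,
# and KL(p || p) should be 0.
rng = np.random.default_rng(0)
mu_p, mu_q = rng.normal(size=3), rng.normal(size=3)
A = rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3))
cov_p = A @ A.T + 3 * np.eye(3)  # A A^T + c I is symmetric positive definite
cov_q = B @ B.T + 3 * np.eye(3)
print(kl_mvn(mu_p, cov_p, mu_q, cov_q))
print(kl_mvn(mu_p, cov_p, mu_p, cov_p))
```

A quick sanity test like `kl_mvn(mu, cov, mu, cov) == 0` (up to floating-point error) is a useful way to localize the bug if your implementation returns negative values.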