The Kullback-Leibler divergence between two (discrete) probability distributions is defined as $$ D_{KL}(P\|Q) = \sum_i p_i \log \frac{p_i}{q_i}, $$ where $p_i$ is the probability that $P$ assigns to the event $i$, and $q_i$ is the probability assigned by $Q$.
I know that the quantity $D_{KL}(P\|Q) + D_{KL}(Q\|P)$ (symmetrised Kullback-Leibler divergence) is sometimes used, because it is symmetric and thus behaves more like a distance between the two distributions. But does anyone know of a case where the quantity $$ \sum_i (p_i-q_i) \log \frac{p_i}{q_i} $$ is used, and whether it has a standard name? I ask because it came up in some statistical mechanics work I'm doing and I want to know if it has an interpretation in terms of information theory, or any particularly interesting known properties.
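For concreteness, the quantities involved can be computed directly from the definition. Below is a minimal sketch (the two distributions are made up purely for illustration; natural log throughout):

```python
import math

def kl(p, q):
    """D_KL(P||Q) = sum_i p_i * log(p_i / q_i) for two discrete
    distributions with matching support, given as probability lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Toy distributions, chosen only for illustration.
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]

print(kl(p, q))             # D_KL(P||Q)
print(kl(q, p))             # D_KL(Q||P) -- generally different: KL is asymmetric
print(kl(p, q) + kl(q, p))  # the symmetrised KL divergence
```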
Since \begin{align*} \sum_i (p_i - q_i) \log \frac{p_i}{q_i} &= \sum_i p_i \log \frac{p_i}{q_i} - \sum_i q_i \log \frac{p_i}{q_i}\\ &= \sum_i p_i \log \frac{p_i}{q_i} + \sum_i q_i \log \frac{q_i}{p_i}\\ &= D_{KL}(P\|Q) + D_{KL}(Q\|P), \end{align*} the quantity in question is exactly the symmetrised KL divergence, which is also known as the Jeffreys divergence.
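The algebra above can also be checked numerically. A quick sketch (toy distributions, natural log):

```python
import math

# Toy distributions, chosen only for illustration.
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]

kl = lambda a, b: sum(ai * math.log(ai / bi) for ai, bi in zip(a, b))

# Left-hand side: sum_i (p_i - q_i) * log(p_i / q_i)
lhs = sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))
# Right-hand side: D_KL(P||Q) + D_KL(Q||P)
rhs = kl(p, q) + kl(q, p)

print(abs(lhs - rhs) < 1e-12)  # True: the two expressions coincide
```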
AB,