e raised to the power of negative Kullback-Leibler divergence


When looking at the similarity between two distributions, I found the Bhattacharyya coefficient pretty intuitive to understand, since it is normalized to 1 ($0\leq BC(p,q)\leq 1$) and the coefficient itself can be understood as the fraction of overlap between the two distributions (I may be wrong; please point it out if I am).

The relation between the Bhattacharyya distance ($D_B$) and the coefficient is $$ BC = e^{-D_B}. $$ We know that $0\leq D_B< \infty$, i.e. it is a non-normalized, non-negative number.
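To make the relation concrete, here is a minimal sketch for two discrete distributions (the probability vectors `p` and `q` are made-up example data): the Bhattacharyya coefficient is $\sum_i \sqrt{p_i q_i}$, and the distance is defined as its negative log, so $BC = e^{-D_B}$ holds by construction.

```python
import numpy as np

# Two hypothetical discrete probability distributions over 3 outcomes
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.2, 0.3, 0.5])

# Bhattacharyya coefficient: sum over sqrt(p_i * q_i), always in [0, 1]
bc = np.sum(np.sqrt(p * q))

# Bhattacharyya distance, defined so that BC = exp(-D_B)
d_b = -np.log(bc)

print(bc)             # a number between 0 and 1
print(np.exp(-d_b))   # equals bc by construction
```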

Moving on to the Kullback-Leibler divergence ($D_{KL}$), which also satisfies $ 0\leq D_{KL}< \infty$: is there anything intuitive about the quantity $e^{-D_{KL}}$? So far I've come across only one paper discussing its properties (http://cm00.epage.au.edu.tw/ezfiles/7/1007/img/2852/24(2)7-3(141-151).pdf). This also raises the question: why base $e$, i.e. $e^{-D_{KL}}$, and not an arbitrary constant $a$ raised to the same exponent, $a^{-D_{KL}}$?
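For reference, the quantity in question can be computed directly for discrete distributions (again with made-up example vectors `p` and `q`): since $D_{KL}\in[0,\infty)$, the map $D_{KL}\mapsto e^{-D_{KL}}$ sends it into $(0,1]$, reaching 1 exactly when $p=q$.

```python
import numpy as np

# Two hypothetical discrete probability distributions over 3 outcomes
p = np.array([0.1, 0.4, 0.5])
q = np.array([0.2, 0.3, 0.5])

# KL divergence D_KL(p || q) = sum_i p_i * log(p_i / q_i), >= 0
d_kl = np.sum(p * np.log(p / q))

# exp(-D_KL) maps [0, inf) into (0, 1]; it equals 1 iff p == q
similarity = np.exp(-d_kl)

print(d_kl)        # non-negative
print(similarity)  # in (0, 1]
```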

Disclaimer: My stats background is not extraordinary, so this may be a very basic question after all. It's just that for my application I'd rather deal with normalized numbers than with a quantity I can't readily interpret.