Under what conditions is the $f$-divergence a distance?


It is well known that the total variation is a distance, whereas the KL-divergence is not (it is neither symmetric nor does it satisfy the triangle inequality). I am wondering under what conditions on $f$ the induced $f$-divergence is a distance. Or, as a weaker question: when does an $f$-divergence satisfy the triangle inequality?
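To make the contrast concrete, here is a minimal numerical sketch (assuming finite discrete distributions with full support, and using the convention $D_f(P\|Q)=\sum_x q(x)\,f\!\left(p(x)/q(x)\right)$ with $f(t)=\frac12|t-1|$ for total variation and $f(t)=t\log t$ for KL):

```python
import math

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)) for discrete distributions
    with q(x) > 0 everywhere (avoids the 0-division edge case)."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q))

# Generator f(t) = |t - 1| / 2 gives total variation distance.
tv = lambda t: abs(t - 1) / 2
# Generator f(t) = t * log t gives KL divergence.
kl = lambda t: t * math.log(t)

P = [0.6, 0.3, 0.1]
Q = [0.3, 0.3, 0.4]
R = [0.4, 0.3, 0.3]

# Total variation is symmetric and satisfies the triangle inequality:
assert abs(f_divergence(P, Q, tv) - f_divergence(Q, P, tv)) < 1e-12
assert f_divergence(P, Q, tv) <= f_divergence(P, R, tv) + f_divergence(R, Q, tv) + 1e-12

# KL is not even symmetric, so it cannot be a distance:
print(f_divergence(P, Q, kl))  # KL(P||Q)
print(f_divergence(Q, P, kl))  # KL(Q||P), a different value
```

This only illustrates the failure for KL on one example, of course; the question is about which generators $f$ rule such failures out in general.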