The relation between Bregman divergence and KL divergence


I see that the Bregman divergence is defined as $d_\phi(x,y)=\phi(x)-\phi(y)-\langle x-y,\nabla\phi(y)\rangle$, where $x,y\in\mathbb{R}^d$ and $\phi$ is a strictly convex, differentiable function.

The KL divergence is an instance of the Bregman divergence: if $p$ is a discrete probability distribution ($\sum_{i=1}^d p_i=1$) and we take the negative entropy $\phi(p)=\sum_{i=1}^d p_i \log p_i$ as the generating function, then the Bregman divergence between two such distributions is exactly the KL divergence.
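As a quick numerical sanity check of this claim, here is a minimal NumPy sketch (the function names `bregman`, `neg_entropy`, and `kl` are my own, not from any library): on the probability simplex, the linear correction terms in the Bregman divergence cancel because both distributions sum to one, leaving exactly $\sum_i p_i \log(p_i/q_i)$.

```python
import numpy as np

def bregman(x, y, phi, grad_phi):
    """Bregman divergence d_phi(x, y) = phi(x) - phi(y) - <x - y, grad phi(y)>."""
    return phi(x) - phi(y) - np.dot(x - y, grad_phi(y))

def neg_entropy(p):
    """Negative entropy phi(p) = sum_i p_i log p_i."""
    return np.sum(p * np.log(p))

def grad_neg_entropy(p):
    """Gradient of negative entropy: (log p_i + 1) elementwise."""
    return np.log(p) + 1.0

def kl(p, q):
    """KL divergence sum_i p_i log(p_i / q_i)."""
    return np.sum(p * np.log(p / q))

# Two discrete distributions on the same support (sum to 1).
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

# The two quantities agree up to floating-point error.
print(np.isclose(bregman(p, q, neg_entropy, grad_neg_entropy), kl(p, q)))
```

The cancellation relies on $\sum_i p_i = \sum_i q_i = 1$; for general nonnegative vectors the Bregman divergence of negative entropy is instead the generalized KL divergence, which carries the extra term $\sum_i (q_i - p_i)$.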

What about the KL divergence between two continuous distributions? Can it also be obtained from a Bregman divergence (perhaps a suitably generalized one)?