Upper bound on cross entropy or relative entropy


I am looking for upper bounds on the relative entropy or cross-entropy in the non-Gaussian case. All I am aware of are bounds on the entropy based on the determinant of the covariance matrix.

What about for relative entropy?


There is no unrestricted upper bound on the relative entropy: it can be made arbitrarily large (even infinite) by letting $q$ assign vanishing probability where $p$ does not. See e.g. here.

There are, however, some upper bounds in terms of particular features of the distributions. See e.g. here, here, or here.
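As a quick numerical sketch (not part of the original answer), the unboundedness is already visible for two Bernoulli distributions: fixing $p$ and letting the parameter of $q$ tend to zero drives $D_{KL}(p||q)$ to infinity.

```python
import math

def kl_bernoulli(p, q):
    """KL divergence D(Bernoulli(p) || Bernoulli(q)) in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

# As q -> 0 with p = 0.5 fixed, the divergence grows without bound.
for q in [1e-1, 1e-3, 1e-6, 1e-12]:
    print(f"q = {q:.0e}:  D(p||q) = {kl_bernoulli(0.5, q):.3f}")
```

The divergence grows roughly like $-\tfrac{1}{2}\log q$ here, so no distribution-free constant can bound it.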


As the other answer already stated, the KL divergence (a.k.a. relative entropy) is not bounded.

Since the cross entropy $H(p,q)$ can be decomposed as $$H(p,q) = H(p) + D_{KL}(p||q),$$ and $H(p) \ge 0$ for discrete distributions, it is therefore also not bounded above.
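The decomposition above can be checked numerically; this is a small sketch with a hypothetical pair of discrete distributions (natural-log convention throughout):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i log q_i."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl(p, q):
    """Relative entropy D_KL(p || q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example distributions (assumed for illustration only).
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]

# Verify H(p, q) = H(p) + D_KL(p || q)
lhs = cross_entropy(p, q)
rhs = entropy(p) + kl(p, q)
print(f"H(p,q) = {lhs:.6f},  H(p) + D_KL(p||q) = {rhs:.6f}")
```

Because $D_{KL}(p||q)$ can be driven to infinity (e.g. by sending some $q_i \to 0$ while $p_i > 0$), the same holds for $H(p,q)$.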