Convexity of a difference of KL divergences


For probability measures $\mu$, $\nu$ supported on $\mathbb{R}^2$ with $\mu \ll \nu$, consider the Kullback-Leibler divergence $$ \text{KL}(\mu | \nu) = \int \log \frac{d\mu}{d\nu}(x) \, d\mu(x), $$ which is known to be convex (indeed jointly convex in $(\mu, \nu)$).

Consider the projection map $\pi(x,y) = x$ and let $\pi_\sharp$ denote the pushforward of measures. I conjecture that the functional $$ J(\mu) = \text{KL}(\mu|\nu) - \alpha\, \text{KL}(\pi_\sharp \mu \,|\, \pi_\sharp\nu) $$ is convex in $\mu$ when $\alpha \leq 1$, but I have trouble proving this. I would be grateful for hints or references.
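(Not a proof, but as a sanity check: below is a minimal numerical sketch of midpoint convexity for discrete measures on a small grid standing in for $\mathbb{R}^2$; the helper names `kl`, `J` and the grid size are my own choices, purely illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    # Discrete KL divergence sum_i p_i log(p_i / q_i), with the convention 0 log 0 = 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def J(mu, nu, alpha):
    # J(mu) = KL(mu | nu) - alpha * KL(pi_# mu | pi_# nu),
    # where pi is the projection onto the first coordinate (axis 0 of the grid).
    return kl(mu, nu) - alpha * kl(mu.sum(axis=1), nu.sum(axis=1))

def random_dist(shape):
    # Random probability vector with full support (so mu << nu holds trivially).
    p = rng.random(shape) + 1e-2
    return p / p.sum()

n = 4                      # measures live on an n-by-n grid
nu = random_dist((n, n))
worst_gap = np.inf
for alpha in (0.25, 0.5, 1.0):
    for _ in range(10_000):
        mu0, mu1 = random_dist((n, n)), random_dist((n, n))
        mid = 0.5 * (mu0 + mu1)
        # Midpoint convexity requires this gap to be nonnegative.
        gap = 0.5 * (J(mu0, nu, alpha) + J(mu1, nu, alpha)) - J(mid, nu, alpha)
        worst_gap = min(worst_gap, gap)
print("smallest convexity gap observed:", worst_gap)
```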

EDIT: The intuition is that the negative curvature contributed by the subtracted second KL term is absorbed by the positive curvature of the first term.
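To make this concrete, one identity that may be useful is the chain rule (disintegration) for relative entropy: if $\mu(\cdot \mid x)$ and $\nu(\cdot \mid x)$ denote regular conditional distributions of the second coordinate given $x$ (which exist on $\mathbb{R}^2$), then $$ \text{KL}(\mu|\nu) = \text{KL}(\pi_\sharp\mu \,|\, \pi_\sharp\nu) + \int \text{KL}\big(\mu(\cdot\mid x)\,\big|\,\nu(\cdot\mid x)\big)\, d\pi_\sharp\mu(x), $$ so that $$ J(\mu) = (1-\alpha)\,\text{KL}(\pi_\sharp\mu\,|\,\pi_\sharp\nu) + \int \text{KL}\big(\mu(\cdot\mid x)\,\big|\,\nu(\cdot\mid x)\big)\, d\pi_\sharp\mu(x). $$ The first summand is convex in $\mu$ when $\alpha \leq 1$ (a nonnegative multiple of a KL divergence composed with the linear map $\mu \mapsto \pi_\sharp\mu$), so the question seems to reduce to the convexity of the conditional term.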