This is from the book Pattern Recognition and Machine Learning by Christopher Bishop.
The author states the following form of Jensen's inequality:
$f\left(\int{xp(x)dx}\right) \leq \int{f(x)p(x)dx}$
where $f$ is a convex function. It is not explicitly stated, but I believe $p(x)$ is a probability distribution here.
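Not from the book, but here is a quick numerical sanity check of Jensen's inequality in its discrete form, $f\left(\sum_i p_i x_i\right) \leq \sum_i p_i f(x_i)$. The choice $f(x) = x^2$ and the random points are my own assumptions; any convex $f$ and any probability vector $p$ should work:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5)       # arbitrary sample points (an assumption)
p = rng.random(5)
p /= p.sum()                 # normalize so p is a probability distribution

f = lambda t: t**2           # a convex function (assumed choice)

lhs = f(np.dot(p, x))        # f( E[x] )
rhs = np.dot(p, f(x))        # E[ f(x) ]
print(lhs <= rhs)            # Jensen: should print True
```

The same check passes for any other convex choice such as `np.exp` or `-np.log` (on positive points).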
The author goes on to say that from this inequality, it follows that
$-\int{p(x)\ln{\frac{q(x)}{p(x)}}dx} \geq -\ln{\int{q(x)dx}}$
where $q(x)$ and $p(x)$ are probability distributions. Despite my best efforts, I have not been able to see how this is a direct application of the above version of Jensen's inequality, due to the presence of the $q(x)$ terms and the lack of a plain $x$ inside $f$ on the right-hand side.
Can anyone shed some light on this?
My best guess: we may(?) extend the above inequality to state that $$ f\left(\int{g(x)p(x)dx}\right) \leq \int{f(g(x))p(x)dx} $$ Then, taking $f = -\ln$ (which is convex) and $g(x) = q(x)/p(x)$, we have $$ -\ln\left(\int q(x)\,dx\right) = -\ln\left(\int \frac{q(x)}{p(x)} \cdot p(x) \,dx\right) \leq - \int p(x) \ln \left( \frac{q(x)}{p(x)} \right)dx $$
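Again not from the book, but a numerical check of this derived inequality (the left side is the KL divergence $\mathrm{KL}(p\|q)$, and the right side is $-\ln \int q(x)\,dx = -\ln 1 = 0$ when $q$ is normalized). The specific random distributions are my own assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
p = rng.random(6); p /= p.sum()   # two arbitrary discrete distributions
q = rng.random(6); q /= q.sum()   # (assumed; any normalized p, q work)

lhs = -np.dot(p, np.log(q / p))   # -sum_i p_i ln(q_i / p_i), i.e. KL(p||q)
rhs = -np.log(q.sum())            # -ln sum_i q_i = -ln 1 = 0
print(lhs >= rhs)                 # should print True
```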