Problem. Let $f(x), g(x)$ be probability densities on $\mathbb{R}^n$ with $f(x), g(x) > 0$ for all $x$. Show that $$ E_f(\log f(x)) \ge E_f(\log g(x)), $$ where $E_f(h(x)) = \int_{\mathbb{R}^n} h(x) f(x) \, dx$ denotes the expectation of $h(x)$ with respect to the density $f(x)$.
We are allowed to use Jensen's inequality. Since $\log$ is concave, we have that
$$ \log\int_{\mathbb{R}^n} f(x) f(x) \,dx \ge \int_{\mathbb{R}^n} \log f(x) f(x) \, dx = E_f(\log f(x)) $$
and
$$ \log\int_{\mathbb{R}^n} g(x) f(x) \, dx \ge \int_{\mathbb{R}^n} \log g(x) f(x) \, dx = E_f(\log g(x)) $$
following this post, but I couldn't get past this point: both bounds have the expectations on the smaller side, so they don't let me compare $E_f(\log f(x))$ and $E_f(\log g(x))$ directly.
Your question is equivalent to the non-negativity of the Kullback–Leibler divergence $D_{KL}(f\parallel g) = E_f(\log f(x)) - E_f(\log g(x))$. Applying Jensen's inequality to the convex function $-\log$ (note the expectation is taken with respect to $f$, not $g$):
\begin{align*} D_{KL}(f\parallel g) &=\mathbb E_f\left[-\log\frac{g(x)}{f(x)}\right] \\ &\geq -\log \mathbb E_f\left[\frac{g(x)}{f(x)}\right] \\ &=-\log\int f(x)\frac{g(x)}{f(x)}\,\mathrm dx \\ &=-\log\int g(x)\,\mathrm dx=-\log1=0. \end{align*}
Rearranging $D_{KL}(f\parallel g)\ge 0$ gives $E_f(\log f(x)) \ge E_f(\log g(x))$, which is exactly your inequality (often called Gibbs' inequality). The key difference from your attempt is that Jensen is applied to the ratio $g(x)/f(x)$, so the awkward $\int f^2$ and $\int fg$ terms never appear.
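As a quick numerical sanity check of the inequality (not part of the proof), here is a Monte Carlo estimate of $E_f(\log f)$ and $E_f(\log g)$ for two illustrative densities of my choosing: $f = \mathcal N(0,1)$ and $g = \mathcal N(1, 1.5^2)$ on $\mathbb R$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice of densities: f = N(0, 1), g = N(1, 1.5^2)
def log_f(x):
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_g(x, mu=1.0, sigma=1.5):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)

# Monte Carlo: sample from f, average to estimate E_f[.]
x = rng.standard_normal(1_000_000)

E_f_log_f = log_f(x).mean()
E_f_log_g = log_g(x).mean()

# Gibbs' inequality: E_f[log f] >= E_f[log g];
# the difference estimates D_KL(f || g) >= 0
print(E_f_log_f, E_f_log_g, E_f_log_f - E_f_log_g)
```

For these two Gaussians the difference should come out close to the closed-form $D_{KL} = \log\frac{\sigma_g}{\sigma_f} + \frac{\sigma_f^2 + (\mu_f-\mu_g)^2}{2\sigma_g^2} - \frac12 \approx 0.35$, and in particular positive, as the proof requires.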