Say I have a sequence of probability measures $\mu_n$ with Lebesgue densities $f_n$ that converges in total variation norm to a probability measure $\mu$ with Lebesgue density $f$. Assume further that $f$ and the $f_n$ are continuous and bounded, and that the entropy of each measure is finite. Can I conclude that the sequence of entropies converges to the entropy of the limiting measure, i.e. that $$\int f_n(x) \log f_n(x)\, dx \to \int f(x)\log f(x)\, dx\;?$$
I had the following idea, but now I'm stuck: convergence in total variation norm implies weak convergence, so by the Portmanteau theorem we have, for each fixed $k$, $$F_{k,n} := \int f_n(x)\log f_k(x)\, dx \to \int f(x)\log f_k(x)\, dx.$$ But I have no idea whether this helps with finding the limit of the "diagonal" $F_{n,n}$.
No, not necessarily. As a counterexample, take $f_n(x)=(n-|x|)/(n^2a_n)$ for $|x|\leq n$ and $f_n(x)=0$ for $|x|>n$. If we choose the sequence $a_n$ so that $a_n \to \infty$, then the measures $\mu_n$ converge to $0$ in total variation norm, because $\int_{-n}^{n} f_n(x)\, dx=1/a_n$ (and $\|\mu_n\|_{TV}=\int_{-\infty}^{\infty} |f_n(x)|\, dx$). The entropy is $\int_{-n}^{n} f_n(x) \ln f_n(x)\, dx$ (some people use the negative of this as the definition). Substituting $u=n-|x|$ and using symmetry, the integral evaluates to $$\int_{-n}^{n} f_n(x) \ln f_n(x)\, dx = \frac{2}{n^2 a_n}\int_0^n u \ln\frac{u}{n^2 a_n}\, du = \frac{-\tfrac{1}{2}-\ln n-\ln a_n}{a_n}.$$ So if $\limsup_{n \to \infty} \ln(n)/a_n>0$, then the entropies do not converge to $0$, which is the entropy of the $0$ measure. For example, choose $a_n=\ln(n+1)$ (assuming $n$ starts at $1$); then the entropies actually converge to $-1$. (Strictly speaking the $\mu_n$ here are not probability measures, since their total mass $1/a_n$ tends to $0$. But one can check that mixing repairs this: for a fixed continuous, bounded, compactly supported probability density $g$ with finite entropy, the densities $g_n=(1-1/a_n)g+f_n$ are genuine probability densities converging to $g$ in total variation, while their entropies converge to $\int g\ln g - 1$.)
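Not part of the argument, but here is a quick numerical sanity check of the closed form, as a minimal sketch in Python assuming $a_n=\ln(n+1)$ (the helper name `entropy` is just for illustration):

```python
# Hypothetical check of the closed form (-1/2 - ln(n) - ln(a_n)) / a_n
# for the entropy of f_n, with a_n = ln(n+1) as in the example above.
import numpy as np
from scipy.integrate import quad

def entropy(n):
    """int_{-n}^{n} f_n ln f_n dx via quadrature, for a_n = ln(n+1).

    By symmetry and the substitution u = n - |x| = n*t, the integral
    equals (2/a_n) * int_0^1 t * ln(t/(n*a_n)) dt.
    """
    a_n = np.log(n + 1.0)
    # guard t = 0, where t*ln(t) -> 0
    integrand = lambda t: t * np.log(t / (n * a_n)) if t > 0 else 0.0
    val, _err = quad(integrand, 0.0, 1.0)
    return 2.0 * val / a_n

for n in [10, 10**3, 10**6, 10**9]:
    a_n = np.log(n + 1.0)
    closed_form = (-0.5 - np.log(n) - np.log(a_n)) / a_n
    print(f"n = {n:>10}: quadrature = {entropy(n):+.6f}, formula = {closed_form:+.6f}")
```

The two columns agree, and they approach $-1$ only slowly (e.g. about $-1.17$ at $n=10^9$), since the correction term is of order $\ln(a_n)/a_n$.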