We have a sequence of probability measures $P_n, n≥1$, and a probability measure $Q$ such that $KL(P_n || Q) < v$ for each $n$, where $v$ is a finite constant and $KL(P||Q)$ is the Kullback–Leibler divergence, i.e. $KL(P||Q) ≔ ∫\ln\frac{dP}{dQ}(x)P(dx)$ whenever $P ≪ Q$ and the integral is finite.
We can assume that all probability measures are on the real line with the usual Borel σ-algebra (I am interested in the case of Polish spaces, but I suspect it does not make a difference here).
Does it follow that $P_n, n≥1$ is (uniformly) tight? That is, for each $ε > 0$ there is a compact $K$ such that $P_n[K^c] < ε$ for each $n$.
Since $P \ll Q$, by the Radon–Nikodym theorem there is a density $f = \frac{dP}{dQ}$, i.e. $dP = f\,dQ$.
If we assume $\sup_x f(x) < \infty$, then $$P(K^c)=\int_{K^c} f\,dQ \leq \Big(\sup_x f(x)\Big)\, Q(K^c) < \epsilon,$$ where $K$ is chosen so that $Q(K^c)$ is small enough (a single probability measure on a Polish space is tight).
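To see this bound in action, here is a toy discrete sanity check (all numbers made up): with a bounded density, the tail mass of $P$ is controlled by that of $Q$.

```python
# Toy discrete example (all numbers made up): if f = dP/dQ is bounded,
# then P(K^c) <= sup(f) * Q(K^c), so tightness of Q transfers to P.
Q = {0: 0.5, 1: 0.3, 2: 0.15, 3: 0.05}   # base probability weights
f = {0: 0.8, 1: 1.5, 2: 0.8, 3: 0.6}     # bounded density dP/dQ
P = {x: f[x] * Q[x] for x in Q}
assert abs(sum(P.values()) - 1.0) < 1e-9  # P is a probability measure

K = {0, 1}                                # candidate "compact" set of atoms
P_tail = sum(P[x] for x in P if x not in K)   # P(K^c)
Q_tail = sum(Q[x] for x in Q if x not in K)   # Q(K^c)
sup_f = max(f.values())
assert P_tail <= sup_f * Q_tail           # the bound above
```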
If instead we assume $\int f^2\,dQ < \infty$, then by the Cauchy–Schwarz inequality in $L^2(Q)$: $$P(K^c)=\int_{K^c} f\,dQ = \int f\,\mathbf{1}_{K^c}\,dQ \leq \sqrt{\int f^2\,dQ}\,\sqrt{Q(K^c)} < \epsilon.$$
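The same toy example (all numbers made up) also checks the Cauchy–Schwarz version of the bound:

```python
import math

# Toy discrete example (all numbers made up) for the Cauchy–Schwarz bound:
# P(K^c) = ∫_{K^c} f dQ <= sqrt(∫ f^2 dQ) * sqrt(Q(K^c)).
Q = {0: 0.5, 1: 0.3, 2: 0.15, 3: 0.05}    # base probability weights
f = {0: 0.8, 1: 1.5, 2: 0.8, 3: 0.6}      # density dP/dQ, finite L2(Q) norm
K = {0, 1}                                 # candidate "compact" set of atoms

P_tail = sum(f[x] * Q[x] for x in Q if x not in K)    # P(K^c)
l2_norm = math.sqrt(sum(f[x] ** 2 * Q[x] for x in Q)) # sqrt(∫ f^2 dQ)
Q_tail = sum(Q[x] for x in Q if x not in K)           # Q(K^c)
assert P_tail <= l2_norm * math.sqrt(Q_tail)
```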
Please also see: Equivalent ideas of absolute continuity of measures
Now, as above, let $f_n = \frac{dP_n}{dQ}$. The hypothesis gives the uniform bound $$D(P_n \,\|\, Q) = \int f_n \log(f_n)\,dQ < v.$$ Try to derive one of the above conditions on $f_n$, uniformly in $n$, from this KL divergence inequality. Let me know if this is useful.
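One concrete route (a sketch, not claiming it is the only one) is a truncation argument rather than a pointwise or $L^2$ bound: for $M > 1$, on $\{f_n > M\}$ we have $f_n \leq f_n \log f_n / \log M$, and since $t\log t \geq -1/e$ pointwise, the KL bound gives $\int_{\{f_n > M\}} f_n\,dQ \leq (v + 1/e)/\log M$ uniformly in $n$, hence $P_n(K^c) \leq M\,Q(K^c) + (v + 1/e)/\log M$. A quick numerical check of the truncation inequality on a made-up discrete example:

```python
import math

# Toy discrete example (all numbers made up) checking the truncation bound:
# for M > 1,   ∫_{f > M} f dQ  <=  (KL(P||Q) + 1/e) / log M,
# which follows since f <= f*log(f)/log(M) on {f > M} and t*log(t) >= -1/e.
Q = [0.4, 0.3, 0.2, 0.1]                      # base probability weights
f = [0.5, 0.5, 1.0, 4.5]                      # density dP/dQ at each atom
P = [fi * qi for fi, qi in zip(f, Q)]
assert abs(sum(P) - 1.0) < 1e-9               # P is a probability measure

kl = sum(fi * math.log(fi) * qi for fi, qi in zip(f, Q))  # KL(P||Q)
M = 2.0
tail = sum(fi * qi for fi, qi in zip(f, Q) if fi > M)     # mass where f > M
assert tail <= (kl + 1.0 / math.e) / math.log(M)
```

Choosing $M$ large and then $K$ with $Q(K^c)$ small makes both terms small, uniformly in $n$.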