The KL divergence is not a distance metric: it is not symmetric and it violates the triangle inequality. However, consider the following problem. Let $A$ and $B$ be two discrete probability distributions with full support, and let
$$D(A||B) = \delta$$
A sequence of probability distributions $\{C_t\}$ satisfies the following: for any $\epsilon>0$, there exists $N$ such that for all $t\geq N$,
$$D(A||C_t) \leq \epsilon$$
Can I say anything about a lower bound on $D(B||C_t)$ for $t\geq N$? The intuition is that if $A$ and $B$ are "at least $\delta$ apart" and $C_t$ can be arbitrarily "close" to $A$, then $B$ cannot be too "close" to $C_t$. However, I am not sure whether this intuition (a) is correct or (b) can be made precise.
As I show below, there is no way to bound $D(B\Vert A)$ in terms of $D(A\Vert B)$: the two can differ by an arbitrarily large amount. In particular, swapping the roles of $A$ and $B$ in the example below gives a pair with $D(A\Vert B)$ arbitrarily large while $D(B\Vert A)$ is arbitrarily close to $0$. This means that even in the extreme case $\epsilon=0$, so that $C_t=A$, nothing nontrivial can be said about $D(B\Vert C_t)$.
To see that the asymmetry in KL can be unbounded, consider the simple binary case $x \in \{0,1\}$ where, for a parameter $\eta \in (0,1)$ (unrelated to the $\epsilon$ in the question),
$$A(0)=1-\eta e^{-1/\eta^2},\quad A(1)=\eta e^{-1/\eta^2},\qquad B(0) = 1-\eta,\quad B(1) = \eta.$$
We will employ the following inequalities, which are derived using $A(0)\le 1$ and $\ln (1+x) \ge x/(1+x)$:
\begin{align}
A(0) \ln \frac{A(0)}{B(0)} & \le - \ln B(0) = -\ln (1-\eta) \le \frac{\eta}{1 -\eta},\\
B(0) \ln \frac{B(0)}{A(0)} & \ge B(0) \ln B(0) = (1-\eta) \ln (1-\eta) \ge -\eta.
\end{align}
Since $A(1) \le B(1)$, the second term of $D(A\Vert B)$ is nonpositive, so
\begin{align}
D(A\Vert B) & = A(0) \ln \frac{A(0)}{B(0)} + A(1) \ln \frac{A(1)}{B(1)} \le A(0) \ln \frac{A(0)}{B(0)} \le \frac{\eta}{1-\eta}.
\end{align}
In the other direction,
\begin{align}
D(B\Vert A) & = B(0) \ln \frac{B(0)}{A(0)} + B(1) \ln \frac{B(1)}{A(1)} \ge -\eta + \eta \ln e^{1/\eta^2} = -\eta + \frac{1}{\eta}.
\end{align}
Letting $\eta \to 0$ therefore drives $D(A\Vert B)$ to $0$ while $D(B\Vert A)$ grows without bound.
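As a quick numerical sanity check, here is a short Python sketch (the helper `kl` is my own, not a library function) that evaluates both divergences for the two-point distributions above with the example's parameter set to $0.1$:

```python
import math

def kl(p, q):
    """D(p || q) in nats, for discrete distributions with full support."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

eta = 0.1  # the example's parameter (not the epsilon from the question)
A = [1 - eta * math.exp(-1 / eta**2), eta * math.exp(-1 / eta**2)]
B = [1 - eta, eta]

print(kl(A, B))  # small: at most eta/(1-eta) ~ 0.111
print(kl(B, A))  # large: at least 1/eta - eta = 9.9
```

Shrinking the parameter further pushes the first value toward $0$ and the second toward infinity, matching the bounds derived above.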