Prove or provide a counterexample for the following claim: Let $(X_n,Z_n)$ be a pair of real-valued random variables for which $X_n \sim N(0,1)$ for each $n$, and $Z_n \geq 0$ almost surely. Then for any sequence of numbers $a_n \in \mathbb{R}$, \begin{align*} \mathbb{P}(a_n - 1/n < X_n+Z_n < a_n) \not\to 1. \end{align*}
The statement seems plausible. I've tried the special case $Z_n = 1_B$ for a fixed event $B$. Then \begin{align*} \mathbb{P}(a_n-1/n<X_n+1_B<a_n) = \mathbb{P}(a_n-1-1/n < X_n < a_n-1 \mid B)\,\mathbb{P}(B) + \mathbb{P}(a_n-1/n<X_n<a_n\mid B^c)\,\mathbb{P}(B^c). \end{align*} Assuming $0 < \mathbb{P}(B) < 1$, if there is a sequence $(a_n)$ for which the left-hand side tends to $1$, then both \begin{align*} \mathbb{P}(a_n-1-1/n < X_n < a_n-1 \mid B) \to 1, \\ \mathbb{P}(a_n-1/n < X_n < a_n \mid B^c) \to 1, \end{align*} since a convex combination with fixed weights in $(0,1)$ can tend to $1$ only if both terms do. So, conditionally on $B$ and on $B^c$ alike, $X_n$ falls into an interval of length $1/n$ with probability tending to $1$. I thought this could be used to argue that the conditional variance of $X_n$ given either event tends to zero, and then play this off against $\operatorname{Var}(X_n)=1$ to get a contradiction, but I don't see how to make this precise. Any hints would be appreciated.
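As a sanity check of the decomposition (not part of any proof), here is a small Monte Carlo sketch for the special case where $B$ is independent of $X_n$, so the conditional probabilities reduce to plain normal probabilities. The choices $a = 0.5$, $n = 5$, $\mathbb{P}(B) = 1/2$ are arbitrary, just enough to make both terms nontrivial:

```python
# Monte Carlo check of
#   P(a - 1/n < X + 1_B < a)
#     = P(a-1-1/n < X < a-1 | B) P(B) + P(a-1/n < X < a | B^c) P(B^c)
# in the toy case where B is independent of X ~ N(0,1).
import random
from statistics import NormalDist

random.seed(0)
a, n, N = 0.5, 5, 200_000
Phi = NormalDist().cdf  # standard normal CDF

hits = 0
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    b = random.random() < 0.5          # event B, independent of X, P(B) = 1/2
    s = x + (1.0 if b else 0.0)        # X + 1_B
    hits += (a - 1.0 / n < s < a)

lhs = hits / N  # Monte Carlo estimate of P(a - 1/n < X + 1_B < a)

# By independence, the conditional probabilities are unconditional ones:
rhs = 0.5 * (Phi(a - 1) - Phi(a - 1 - 1.0 / n)) \
    + 0.5 * (Phi(a) - Phi(a - 1.0 / n))

print(lhs, rhs)  # the two should agree up to Monte Carlo error
```

Of course this only exercises the identity, not the implication about the conditional variances, but it may help when experimenting with candidate counterexamples.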