Let $z>0$ and $Y_n, Z_n$ be random variables with $Y_n\xrightarrow{d} \mathcal N(0,1)$ and $Z_n \xrightarrow{P} z$
Prove that for any $\epsilon>0$, $P\left(\left| Y_n + \sqrt n Z_n\right|>\epsilon \right)\to 1$
Can someone provide a proof of this claim?
Since $Y_n$ converges in distribution, it should be "bounded" (bounded in probability, i.e. tight), and since $Z_n$ converges in probability to a positive constant, I expect that $\sqrt n Z_n$ diverges to $\infty$ in probability, but I haven't been able to formalize these ideas.
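This intuition can be checked numerically. Below is a minimal Monte Carlo sketch, not part of the problem: it assumes stand-in distributions $Y_n \sim \mathcal N(0,1)$ exactly and $Z_n = z + \mathcal N(0,1)/\sqrt n$ (which does converge to $z$ in probability), and estimates $P\left(\left|Y_n+\sqrt n Z_n\right|>\epsilon\right)$ for growing $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
z, eps, trials = 1.0, 0.5, 20_000

probs = []
for n in [1, 10, 100, 1000]:
    # Stand-in distributions (an assumption of this sketch):
    # Y_n is exactly N(0,1); Z_n = z + N(0,1)/sqrt(n) -> z in probability.
    Y = rng.standard_normal(trials)
    Z = z + rng.standard_normal(trials) / np.sqrt(n)
    # Estimated probability that |Y_n + sqrt(n) Z_n| exceeds eps.
    probs.append(np.mean(np.abs(Y + np.sqrt(n) * Z) > eps))

print(probs)  # estimated probabilities, approaching 1
```

The estimates climb toward $1$ as $n$ grows, consistent with the claim, because the deterministic drift $\sqrt n\, z$ eventually dwarfs both the $\mathcal N(0,1)$ term and the threshold $\epsilon$.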
For context, this question originates from the theory of statistical tests (consistency of a test). Here is the original problem.
Let $\theta, \theta_0,x\in \mathbb R$ and $\hat {\theta_n}, X_n$ be random variables such that $\frac{\sqrt n}{X_n}(\hat {\theta_n} - \theta)\xrightarrow{d} \mathcal N(0,1)$ and $X_n \xrightarrow{P} x$.
Prove that for any $\epsilon>0$, $P\left(\left|\frac{\sqrt n}{X_n}(\hat {\theta_n} - \theta_0)\right|>\epsilon \right)$ converges to $1$.
Upon noticing that $$\left|\frac{\sqrt n}{X_n}(\hat {\theta_n} - \theta_0) \right| = \left|\frac{\sqrt n}{X_n}(\hat {\theta_n} - \theta) + \sqrt n \frac{x}{X_n}\frac{\theta-\theta_0}{x} \right|$$
I'm brought to the problem above.
Let $X_n:=\left\lvert \frac{Y_n}{\sqrt n}+Z_n\right\rvert$. Then the sequence $\left(X_n\right)_{n\geqslant 1}$ converges to $z$ in probability: $Y_n/\sqrt n\to 0$ in probability by Slutsky's theorem, hence $Y_n/\sqrt n+Z_n\to z$ in probability, and the map $t\mapsto\lvert t\rvert$ is continuous.
Since $\left\lvert Y_n+\sqrt n Z_n\right\rvert=\sqrt n X_n$, we have to prove that $p_n:=\mathbb P\left\{\sqrt n X_n \leqslant \varepsilon\right\}\to 0$. This follows from $$ p_n\leqslant \mathbb P\left(\left\{\sqrt n X_n \leqslant \varepsilon\right\}\cap\left\{\left\lvert X_n-z \right\vert\leqslant z/2\right\}\right)+\mathbb P\left\{\left\lvert X_n-z \right\vert\gt z/2\right\} $$ and emptiness of $\left\{\sqrt n X_n \leqslant \varepsilon\right\}\cap\left\{\left\lvert X_n-z \right\vert\leqslant z/2\right\}$ for $n$ large enough: on the event $\left\{\left\lvert X_n-z \right\vert\leqslant z/2\right\}$ we have $X_n\geqslant z/2$, so $\sqrt n X_n\geqslant \sqrt n\, z/2>\varepsilon$ as soon as $n>(2\varepsilon/z)^2$, while the remaining term $\mathbb P\left\{\left\lvert X_n-z \right\vert\gt z/2\right\}$ goes to $0$ by the convergence in probability established above.
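The two terms of this bound can also be watched numerically. The sketch below again assumes stand-in distributions ($Y_n\sim\mathcal N(0,1)$, $Z_n=z+\mathcal N(0,1)/\sqrt n$, an assumption not in the problem) and estimates $p_n$ together with the slack term $\mathbb P\left\{\left\lvert X_n-z\right\rvert>z/2\right\}$:

```python
import numpy as np

rng = np.random.default_rng(1)
z, eps, trials = 1.0, 0.5, 20_000

p_ns, slacks = [], []
for n in [4, 16, 64, 256]:
    # Stand-in distributions (an assumption of this sketch).
    Y = rng.standard_normal(trials)
    Z = z + rng.standard_normal(trials) / np.sqrt(n)
    X = np.abs(Y / np.sqrt(n) + Z)                # the X_n of the answer
    p_ns.append(np.mean(np.sqrt(n) * X <= eps))   # p_n
    slacks.append(np.mean(np.abs(X - z) > z / 2)) # P(|X_n - z| > z/2)

for n, p, s in zip([4, 16, 64, 256], p_ns, slacks):
    print(n, p, s)
```

For $z=1$, $\varepsilon=1/2$ the intersection is already empty for every $n\geqslant 4$: $\sqrt n X_n\leqslant\varepsilon$ forces $X_n\leqslant\varepsilon/\sqrt n\leqslant 1/4$, hence $\lvert X_n-z\rvert\geqslant 3/4>z/2$. So each estimated $p_n$ is dominated by the corresponding slack term, and both tend to $0$.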