My question comes from the proof of the Delta method. One of the conditions states that $\sqrt{n}(Y_n - \theta) \rightarrow N(0,\sigma^2)$ in distribution for some sequence of random variables $Y_n$. The proof then uses the consequence that $Y_n \rightarrow \theta$ in probability. It is not clear to me how to prove this. This is my attempt:
By definition we want to show that for every $\epsilon > 0$, $\lim_{n \rightarrow \infty} P(|Y_n - \theta| > \epsilon) = 0$.
We can rewrite this as $P(|Y_n - \theta| > \epsilon) = 1 - \left(P(\sqrt{n}(Y_n - \theta) \le \sqrt{n} \epsilon) - P(\sqrt{n}(Y_n - \theta) < -\sqrt{n}\epsilon)\right)$.
I know that for fixed $t \in \mathbb{R}$, $\lim_{n\rightarrow \infty} P(\sqrt{n}(Y_n -\theta) \le t) = \Phi(t/\sigma)$, where $\Phi$ is the standard normal CDF. But how do I proceed if $t$ is also a function of $n$?
It is tempting to say that if $F_n$ is the CDF of $\sqrt{n}|Y_n-\theta|$, then $F_n(\sqrt{n} \epsilon) \rightarrow 1$ as $n \rightarrow \infty$ by properties of the CDF. But where is the fact that the limiting distribution is normal actually used? The same reasoning would seem to apply to any limiting distribution.
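For intuition, here is a quick Monte Carlo check. The setup is my own illustration, not from the statement above: I take $Y_n$ to be the mean of $n$ i.i.d. Uniform(0,1) draws, so $\theta = 1/2$ and $\sqrt{n}(Y_n - \theta) \rightarrow N(0, 1/12)$ by the CLT. The estimated probability $P(|Y_n - \theta| > \epsilon)$ should shrink to $0$ as $n$ grows, which is exactly the convergence in probability being claimed.

```python
import numpy as np

# Assumed example: Y_n = mean of n i.i.d. Uniform(0,1) draws, theta = 0.5,
# so sqrt(n)(Y_n - theta) -> N(0, 1/12) by the CLT.
rng = np.random.default_rng(0)
theta, eps, reps = 0.5, 0.05, 2000

for n in (10, 100, 1000, 10000):
    samples = rng.uniform(size=(reps, n))
    y_n = samples.mean(axis=1)                 # reps independent realisations of Y_n
    prob = np.mean(np.abs(y_n - theta) > eps)  # Monte Carlo estimate of P(|Y_n - theta| > eps)
    print(f"n={n:>5}  P(|Y_n - theta| > {eps}) ~ {prob:.4f}")
```

The estimated probability is large for small $n$ and essentially $0$ once $\sqrt{n}\epsilon$ is many standard deviations of the limiting normal.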
Let $\epsilon >0$ and $M \in (0,\infty)$. Then $|Y_n -\theta| >\epsilon$ implies that $\sqrt n|Y_n -\theta| > \sqrt n \epsilon >M$, provided $n$ is large enough that $\sqrt n \epsilon >M$. Since $\sqrt{n}(Y_n - \theta) \rightarrow X$ in distribution with $X \sim N(0, \sigma^{2})$, and the normal CDF is continuous everywhere (this is where normality enters), $P(\sqrt n|Y_n-\theta| > M) \rightarrow P(|X| > M)$. Hence $\limsup_n P(|Y_n-\theta| >\epsilon) \leq P(|X| >M)$. You can make $P(|X| >M)$ as small as you wish by choosing $M$ large enough. Can you finish?
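For completeness, here is one way to spell out the finishing step, a sketch assuming the bound above: since the bound holds for every $M > 0$,
$$\limsup_{n\to\infty} P(|Y_n-\theta|>\epsilon) \;\le\; \inf_{M>0} P(|X|>M) \;=\; \lim_{M\to\infty} P(|X|>M) \;=\; 0,$$
and since probabilities are nonnegative, $\lim_{n\to\infty} P(|Y_n-\theta|>\epsilon) = 0$ for every $\epsilon > 0$, i.e. $Y_n \rightarrow \theta$ in probability.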