How do I show that if $\sqrt{n}(X_n - \theta)$ converges in distribution, then $X_n$ converges in probability to $\theta$?
Setting $Y_n = \sqrt{n}(X_n - \theta)$, convergence in distribution (to a random variable $Y$) means: $P(Y_n \leq y) \to P(Y \leq y)$ at every continuity point $y$ of the distribution function of $Y$.
Convergence in probability requires that $P(|X_n - \theta| \geq \epsilon) \to 0$ for every $\epsilon > 0$.
My reasoning so far is the following. Given convergence in distribution, I can use Prokhorov's theorem (tightness): for every $\epsilon > 0$ there is some $M > 0$ such that $\sup_n P(|Y_n| > M) < \epsilon$. Now I need to show that this translates into $P(|X_n - \theta| \geq \epsilon) \to 0$, which is convergence in probability. I'm quite stuck, however; any hints are appreciated.
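For what it's worth, the tightness bound can be pushed through directly. A sketch (writing $\delta$ for the small probability in the tightness statement, to keep it distinct from the $\epsilon$ in the definition of convergence in probability):

```latex
Fix $\epsilon > 0$ and $\delta > 0$. By tightness, choose $M > 0$ with
$\sup_n P(|Y_n| > M) < \delta$. Then for all $n > (M/\epsilon)^2$,
\[
P(|X_n - \theta| \geq \epsilon)
  = P\bigl(|Y_n| \geq \epsilon\sqrt{n}\bigr)
  \leq P(|Y_n| > M) < \delta,
\]
since $\epsilon\sqrt{n} > M$. As $\delta$ was arbitrary,
$P(|X_n - \theta| \geq \epsilon) \to 0$.
```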
We have: $$ \frac{1}{\sqrt{n}}\to 0\implies\frac{1}{\sqrt{n}}\overset{L}\to 0\implies X_n-\theta=\frac{1}{\sqrt{n}}[\sqrt{n}(X_n-\theta)]\overset{L}{\to} 0\implies X_n-\theta\overset{P}{\to} 0. $$ Here, $\overset{L}{\to}$ denotes convergence in distribution and $\overset{P}{\to}$ convergence in probability. The second implication uses Slutsky's theorem, and the last implication uses the fact that convergence in distribution to a constant implies convergence in probability to that constant.
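A quick Monte Carlo check (not part of the proof) makes the conclusion concrete. Here I take $X_n$ to be the sample mean of $n$ i.i.d. Uniform(0,1) draws, so $\theta = 0.5$ and, by the CLT, $\sqrt{n}(X_n - \theta)$ converges in distribution to $N(0, 1/12)$; the choice of distribution and of $\epsilon = 0.05$ is purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, eps, reps = 0.5, 0.05, 5000  # illustrative values, not from the question

estimates = {}
for n in (10, 100, 1000):
    # Each row is one realization of n Uniform(0,1) draws; the row mean is X_n.
    sample_means = rng.uniform(size=(reps, n)).mean(axis=1)
    # Estimated P(|X_n - theta| >= eps); it should shrink toward 0 as n grows.
    estimates[n] = float(np.mean(np.abs(sample_means - theta) >= eps))

for n, p in estimates.items():
    print(f"n = {n:5d}: estimated P(|X_n - theta| >= {eps}) = {p:.4f}")
```

The estimated exceedance probability drops sharply with $n$, which is exactly the convergence in probability the argument above establishes.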