I am currently studying for my statistics final exam and I want to understand one part of the proof of the Delta method. I will skip some details of the proof and ask only one key question:
Let $(\Omega,\mathcal{A},P)$ be a probability space. Let $(c_n)_{n\in\mathbb{N}}$ be a sequence in $\mathbb{R}$ with $c_n\longrightarrow c$ for some $c\in\mathbb{R}$. Define the real random variable $Y_n:= c_n$. Then $Y_n$ converges in probability to the constant random variable $Y:=c$, i.e. \begin{equation} \lim_{n\rightarrow \infty}P(|Y_n-Y|>\varepsilon)=0 \end{equation} for all $\varepsilon >0$.
I want to know whether this is true, whether the implication holds in the other direction, or whether it is in fact an equivalence. Thanks for your answers.
This is true, because in fact $Y_n\to Y$ almost surely, which implies convergence in probability: for every $\omega\in\Omega$ we have $Y_n(\omega)=c_n\to c=Y(\omega)$, so the convergence is even sure.
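To make this concrete, here is a quick numerical sketch (the particular choices $c = 2$ and $c_n = c + 1/n$ are mine, just for illustration): since $Y_n \equiv c_n$ deterministically, $P(|Y_n - Y| > \varepsilon)$ is just the indicator of $|c_n - c| > \varepsilon$, and it is eventually $0$ for every fixed $\varepsilon$.

```python
# Illustrative example (not from the proof): c = 2, c_n = c + 1/n, so c_n -> c.
# Y_n takes the value c_n with probability 1, hence
# P(|Y_n - Y| > eps) is simply the indicator of |c_n - c| > eps.

c = 2.0
eps = 0.01

def prob_exceeds(n: int) -> float:
    """P(|Y_n - Y| > eps) for the deterministic Y_n := c_n = c + 1/n."""
    c_n = c + 1.0 / n
    return 1.0 if abs(c_n - c) > eps else 0.0

probs = [prob_exceeds(n) for n in (10, 50, 1000, 10**6)]
print(probs)  # the probability is 1 while 1/n > eps, then drops to 0
```

So the sequence of probabilities is literally a $0/1$ sequence that hits $0$ for all $n$ large enough, which is the strongest possible form of the limit in the question.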
The reverse direction is also true. Suppose $Y_n\to Y$ in probability. Note that since $Y_n$ and $Y$ are constant random variables, $$P(|Y_n-Y|>\varepsilon) = \begin{cases} 1 & \text{ if } |c_n-c|>\varepsilon\\ 0 & \text{ if } |c_n-c|\leq \varepsilon. \end{cases}$$
Now let $\varepsilon>0$ be given. Since the above expression converges to $0$ as $n\to \infty$ and only takes the values $0$ and $1$, we have $P(|Y_n-Y|>\varepsilon)=0$ for $n$ large enough (try to prove this yourself). By the above expression, this implies that $|c_n-c|\leq\varepsilon$ for $n$ large enough. Since $\varepsilon>0$ was arbitrary, this is exactly the statement $c_n\to c$.
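The reverse direction can be sketched numerically as well (again with an illustrative sequence of my choosing, here $c_n = c + (-1)^n/n$): from the $0/1$ probabilities alone, one can read off that for each $\varepsilon$ only finitely many $n$ satisfy $|c_n - c| > \varepsilon$, which is exactly $c_n \to c$.

```python
# Hypothetical check of the reverse direction: using only the values of
# P(|Y_n - Y| > eps), verify that for each eps the "bad" indices
# {n : |c_n - c| > eps} form a finite set.

c = 2.0

def prob_exceeds(n: int, eps: float) -> float:
    c_n = c + (-1) ** n / n   # deterministic sequence with c_n -> c
    return 1.0 if abs(c_n - c) > eps else 0.0

# For each eps, the largest n (below a cutoff) with probability still 1.
results = {
    eps: max((n for n in range(1, 10_000) if prob_exceeds(n, eps) == 1.0),
             default=0)
    for eps in (0.5, 0.15, 0.0015)
}
print(results)  # past these indices, |c_n - c| <= eps holds for every n
```

The smaller $\varepsilon$ is, the longer it takes for the probabilities to vanish, but for every fixed $\varepsilon$ they do vanish eventually, matching the argument above.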