I have this exercise:
Let $(X_n)$ be a sequence of r.v.'s whose distribution is defined for $n \in \mathbb{N}^*$ by
$P\left(X_n=1-\frac{1}{n}\right)= P\left(X_n= 1+\frac{1}{n}\right) = \frac{1}{2}$
Study its convergence in law.
======================================
The solution that I found but can't understand:
For any $\epsilon > 0$ there exists an integer $N = N(\epsilon)$ such that for $n > N$ we have $\frac{1}{n} < \epsilon$, and then:
$\mathbb{P} (|X_n -1| < \epsilon) =\frac{1}{2} +\frac{1}{2}= 1$
Therefore, $\mathbb{P}(|X_n - 1| < \epsilon) \rightarrow 1$ as $n \rightarrow \infty$, which shows that the sequence $(X_n)$ converges in probability to the constant r.v. $X = 1$, and thus also in law, since the limit is a constant.
However, $\mathbb{P}(X_n = 1) = 0$ for every integer $n$, hence $\mathbb{P}(X_n = 1) \rightarrow 0$ as $n \rightarrow \infty$, while $\mathbb{P}(X = 1) = 1$. Thus, one must be careful when using sufficient conditions for convergence in law.
======================================
My questions are:

1. How did they find that $\mathbb{P}(|X_n - 1| < \epsilon) = \frac{1}{2} + \frac{1}{2} = 1$?
2. Which sufficient condition is not satisfied?
1. Since $X_n$ takes only the two values $1 \pm \frac{1}{n}$, we have $\mathbb P(\lvert X_n - 1\rvert \leq 1/n) = 1$. Hence, given $\varepsilon > 0$, for every $n > 1/\varepsilon$ both atoms $1 \pm \frac{1}{n}$ lie in $(1 - \varepsilon, 1 + \varepsilon)$, so $\mathbb P(\lvert X_n - 1\rvert < \varepsilon) = \frac{1}{2} + \frac{1}{2} = 1$, and in particular $\mathbb P(\lvert X_n - 1\rvert < \varepsilon) \to 1$.
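A quick numerical check of this (my own sketch, not part of the exercise; the function names are mine): since $\lvert X_n - 1\rvert = 1/n$ with probability $1$, a Monte Carlo estimate of $\mathbb P(\lvert X_n - 1\rvert < \varepsilon)$ jumps from $0$ to exactly $1$ once $n > 1/\varepsilon$.

```python
import random

def sample_X(n):
    """Draw one realization of X_n: each of 1 - 1/n and 1 + 1/n with prob. 1/2."""
    return 1 - 1/n if random.random() < 0.5 else 1 + 1/n

def prob_within(n, eps, trials=10_000):
    """Monte Carlo estimate of P(|X_n - 1| < eps)."""
    return sum(abs(sample_X(n) - 1) < eps for _ in range(trials)) / trials

eps = 0.01
for n in (50, 200, 1000):  # once n > 1/eps = 100, the estimate is exactly 1
    print(n, prob_within(n, eps))
```

The estimate is deterministic here despite the sampling, because $\lvert X_n - 1\rvert = 1/n$ regardless of which atom is drawn.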
2. Convergence in probability implies convergence in distribution, and the converse holds in the special case $X_n \Rightarrow c$ for a constant $c$. What the solution is pointing out is that $\mathbb P(X_n = 1) = 0$ for every $n$, yet $\mathbb P(X_{\infty} = 1) = \mathbb P(1 = 1) = 1$. This is not a contradiction, because $X_n \Rightarrow X_{\infty}$ only means $F_{X_n}(x) \to F_{X_{\infty}}(x)$ at every continuity point $x$ of the distribution function of $X_{\infty}$. Here, since $\mathbb P(X_{\infty} = 1) > 0$, the point $1$ is a discontinuity of $F_{X_{\infty}}$, so nothing is asserted there.
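To make the continuity-point caveat concrete (again my own sketch; `F_n` and `F_limit` are names I chose): the CDF of $X_n$ is $0$ below $1 - 1/n$, jumps to $1/2$ there, and to $1$ at $1 + 1/n$. It converges to the CDF of the constant $1$ at every $x \neq 1$, but at $x = 1$ it is stuck at $1/2$ for all $n$.

```python
def F_n(x, n):
    """CDF of X_n, which puts mass 1/2 on each of 1 - 1/n and 1 + 1/n."""
    return 0.0 if x < 1 - 1/n else (0.5 if x < 1 + 1/n else 1.0)

def F_limit(x):
    """CDF of the constant r.v. X = 1."""
    return 1.0 if x >= 1 else 0.0

# At continuity points of F_limit (any x != 1) the CDFs agree for large n:
for x in (0.5, 0.9, 1.1, 1.5):
    print(x, F_n(x, 10**6), F_limit(x))

# At the discontinuity x = 1: F_n(1, n) = 1/2 for every n, while F_limit(1) = 1.
print(F_n(1, 10**6), F_limit(1))
```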