We have a sequence of independent random variables $X_n$ that satisfies $$P(X_n=0)=\frac1n,\qquad P(X_n=n)=1-\frac1n.$$ I want to check whether this sequence converges in probability and in distribution.
Convergence in probability
The only values $X_n$ takes are $0$ and $n$, and $n \rightarrow \infty$, so the only candidate for a limit in probability is the constant random variable $X=0$. For any fixed $\varepsilon$ with $0<\varepsilon<n$ we get $$P(|X_n-X|>\varepsilon)=P(X_n=n)=1-\frac1n \rightarrow 1 \neq 0.$$ So $X_n$ does not converge in probability to any random variable.
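As a quick numerical sanity check (not part of the proof), we can simulate $X_n$ and estimate $P(|X_n - 0| > \varepsilon)$ by Monte Carlo; the function names and the choice $\varepsilon = 0.5$ are mine, not from the problem:

```python
import random

def sample_X(n, trials=100_000):
    """Draw samples of X_n: value 0 with probability 1/n, value n otherwise."""
    return [0 if random.random() < 1 / n else n for _ in range(trials)]

def prob_far_from_zero(n, eps=0.5, trials=100_000):
    """Monte Carlo estimate of P(|X_n - 0| > eps)."""
    samples = sample_X(n, trials)
    return sum(1 for x in samples if abs(x) > eps) / trials

for n in (10, 100, 1000):
    # The estimate should be close to 1 - 1/n, i.e. it tends to 1, not 0.
    print(n, prob_far_from_zero(n))
```

The estimates track $1-\frac1n$ and approach $1$, consistent with the claim that there is no convergence in probability.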
Convergence in distribution
The cumulative distribution function of $X_n$ is given by $$ F_{X_n}(t)= \begin{cases} 0& \text{for} \;t\in(-\infty,0), \\ \frac1n&\text{for} \;t\in[0,n),\\ 1&\text{for}\;t\in[n,+\infty). \end{cases} $$
Now fix any $t$. For $t<0$ we have $F_{X_n}(t)=0$, and for $t\geq0$ we eventually have $t<n$, so $F_{X_n}(t)=\frac1n \rightarrow 0$. Hence the pointwise limit of $F_{X_n}$ is the zero function, which is not a cumulative distribution function (it does not tend to $1$ at $+\infty$), so $X_n$ cannot converge in distribution to any random variable $X$.
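The pointwise limit can also be illustrated numerically; here is a small sketch (helper name `F` is mine) evaluating the CDF above at fixed points $t$ as $n$ grows:

```python
def F(n, t):
    """CDF of X_n as written above: 0 for t < 0, 1/n on [0, n), 1 on [n, inf)."""
    if t < 0:
        return 0.0
    elif t < n:
        return 1.0 / n
    else:
        return 1.0

# For any fixed t, once n > t we get F(n, t) = 1/n, which tends to 0.
for t in (0, 5, 100):
    print(t, [F(n, t) for n in (10, 1_000, 100_000)])
```

For every fixed $t$ the values shrink toward $0$, matching the argument that the limit function is identically zero and hence not a CDF.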
Can you please verify my proof?